  • If I were still in a senior dev position, I’d ban AI code assistants for anyone with less than about 10 years’ experience. They’re a time saver if you can read code almost as fluently as you read your own native language, but even setting aside the bugs AI code introduces, its approach is often not the most efficient one. It’s only useful if you can tell that at a glance and reject its suggestions as often as you accept them.

    Which, honestly, is how I was when I was first starting out as a developer. I thought I was hot shit and really contributing, when I was actually taking half a day to do tasks an experienced developer could do in minutes. Generative AI is like a new developer: irrationally confident, not actually saving time, and rarely doing things the best way.

  • If it were the best healthcare in the world, we’d have the best outcomes, and we don’t even have that for rich people. We have a (non-metric) shit ton of world-class research universities and highly respected agencies like the FDA and NIH, but Elon Musk, the richest man in the world, can’t even get the mental health services he obviously needs.

    I’d obviously rather go to an American hospital than a hospital in most of the world, but spending a lot to cover up a shitty system isn’t as good as a functioning system.

    Edit: I originally had NHS instead of NIH, but the NHS is, obviously, where British people get their brain medicines.


  • I’m a developer posting on Lemmy, so maybe take this with a huge grain of salt, but I think we need to focus less on STEM/finance and more on humanities education. Definitely in the United States, but probably in most of the world, considering India and China focus on tech too.

    When I was learning to code (in the ’90s and 2000s, unless you count a 9-year-old making BASIC do loops), my mentors had basically all majored in something besides computer science, because there wasn’t necessarily even a computer science major available if your college didn’t have “Tech” in the name. It was a lot of hippies who spent their weekends making pottery and got into IT or software development almost by accident; it was a job to fund their non-lucrative hobby or passion.

    Basically, we lost something when being a programmer became a goal rather than a way to reach some other goal. I’m not sure we can return to a time when it was tinkerers and hobbyists coming to the field from different backgrounds, but more creatives should learn to code, and more coders should be forced to make art.

  • New Orleans ran a pilot allowing facial recognition for major felonies, and it just didn’t work at all. Of the 15 requests, 9 failed to make a match, and of the 6 that did return a “match,” 3 were the wrong person.

    That’s a small sample size (most cities don’t release data), but it explains why cities that happily use facial recognition software don’t see reductions in crime or cleared cases. It’s just a complete waste of money and of investigators’ time. Facial recognition tech can (usually) identify friends in photos, but criminals aren’t posing for fancy modern phone cameras in decent lighting. Investigators are working from security cam stills, and anyone committing a major felony probably has their face at least partially covered.

    It’s like that software that’s supposed to identify gunshots but has so many false positives that police stop even bothering to follow up after a while. Maybe not as stupid as the NYPD buying robots, but still a huge waste of resources.