• 0 Posts
  • 99 Comments
Joined 7 months ago
Cake day: October 19th, 2024




  • Yes, the main problem with developing AI is that we really don’t understand how we think. Current AI doesn’t understand anything; it just imitates human output by processing a vast amount of existing output. But we do know a lot more now about how we think, understand and speak than we did a hundred years ago, and as a linguist you know this work isn’t standing still. Compare it with genetics: 70 years ago we didn’t even know the structure of DNA, and now we can splice genes. The fact that there’s still a lot of baseline work to do shouldn’t cast doubt on the goal, should it?


  • AI can’t replace programmers right now, but I’ve said all through my software dev career that our ultimate goal is to eliminate our own jobs. Software will eventually be able to understand human language and think of all the right questions to ask to turn “Customer wants a button that does something” into an actual spec, and then into fully usable code. It’s just a matter of time. Mocking AI based on what it currently can’t do is like mocking airplanes because of what they couldn’t do in the 1920s.









  • I actually don’t write code professionally anymore, so I’m going on what my friend says: he uses ChatGPT every day to write code and it’s a big help. Once he told it to refactor some code and it used a really novel approach he wouldn’t have thought of. He showed it to another dev, who said the same thing. It was like, huh, that’s a weird way to do it, but it worked. In general, though, you really can’t just tell an AI “Create an accounting system” or whatever and expect coherent working code without thoroughly vetting it.
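
    To give a flavor of the kind of refactor I mean (this is a made-up Python example, not my friend’s actual code): an LLM might replace an obvious if/elif chain with a dispatch table, which looks weird at first glance but behaves the same and is easier to extend.

    ```python
    # Hypothetical "before": the obvious if/elif approach
    def apply_discount_v1(customer_type, price):
        if customer_type == "student":
            return price * 0.8
        elif customer_type == "senior":
            return price * 0.7
        elif customer_type == "employee":
            return price * 0.5
        return price

    # Hypothetical "after": the weird-but-it-works dispatch-table refactor
    DISCOUNTS = {"student": 0.8, "senior": 0.7, "employee": 0.5}

    def apply_discount_v2(customer_type, price):
        # dict.get falls back to 1.0 (no discount) for unknown types
        return price * DISCOUNTS.get(customer_type, 1.0)

    # Both versions agree for every case, known or unknown
    for kind in ("student", "senior", "employee", "guest"):
        assert apply_discount_v1(kind, 100) == apply_discount_v2(kind, 100)
    ```

    The point isn’t that the second version is objectively better — it’s that it’s the sort of restructuring a second pair of eyes (human or AI) suggests, and you still have to check that the behavior really is unchanged.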