• 0 Posts
  • 11 Comments
Joined 2 years ago
Cake day: June 17th, 2023


  • Yes, what I’m saying is that lower costs for software, which AI will help with, will make software more competitive against human production labor. The standard assumption is that if software companies can reduce the cost of producing software, they’ll start firing programmers, but the entire history of software engineering has shown that’s not true as long as the lower cost opens up new economic opportunities for software users, thus increasing demand.

    That pattern stops only when there are no economic opportunities to be unlocked. The only way I think that happens is when automation has become so prevalent that further advancement has minimal impact. I don’t think we’re there yet. Labor costs are still huge and automation is still relatively primitive.


  • One thing that is somewhat unique about software engineering is that a large part of it has always been dedicated to making itself more efficient. From programming languages to protocols, frameworks, and services, all of it has made programmers thousands of times more efficient than the people who used to punch holes into cards to program the computer.

    Nothing has infinite demand, clearly, but the question is whether we’re anywhere near the peak, such that more efficiency would result in an overall decrease in employment. So far, the answer has been no: the industry has only grown as it’s become more efficient.

    I still think the answer is no. There’s far more of our lives and the way people do business that can be automated as the cost of doing so is reduced. I don’t think we’re close to any kind of maximum saturation of tech.


  • Yeah, there’s definitely some overlap. Lots of dark UX is used for enshittification, but sometimes enshittification is just plain bad UX done brazenly for the sake of making money, with a hint of “Yeah, it’s bad. What are you going to do about it?”

    On the other hand, enshittification is part of a cycle that starts with a service that grows dominant at least in part by providing a great experience, only to tear that experience down when it gets in the way of making money. Dark UX isn’t always part of that cycle. Plenty of services of all sizes use these patterns right from the start. Not really accurate to call it “enshittification” when it was always just shit.


  • The above is pretty misleading. A typical Java program can be turned into a Kotlin program with few changes, this is true. But Kotlin code, particularly when written using Kotlin best practices, bears very little resemblance to Java code. If you learn Kotlin first, you’ll find some of that knowledge does transfer to Java, but there’s plenty that won’t, and you’ll have to learn the Java way of doing things too. Still, as a dev, knowing more languages never hurts. I’d still recommend proceeding with Kotlin.
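    To illustrate the point, here’s a minimal sketch of idiomatic Kotlin (the `User` type and `emailDomain` function are hypothetical examples, not from any real codebase). Data classes, default arguments, nullable types, and the `?.`/`?:` operators have no direct Java equivalents; the Java version of this would need an explicit class with getters, `equals`/`hashCode`/`toString`, and `Optional`-style null handling:

    ```kotlin
    // A small record type: one line replaces a whole Java class.
    data class User(val name: String, val email: String? = null)

    // Null-safe navigation (?.) with a fallback (?:), as an expression body.
    fun emailDomain(user: User): String =
        user.email?.substringAfter('@') ?: "no email"

    fun main() {
        val users = listOf(User("Ada", "ada@example.com"), User("Grace"))
        users.forEach { println(emailDomain(it)) }  // prints "example.com", then "no email"
    }
    ```

    A Java developer can read this, but writing it requires habits (expression bodies, null safety, data classes) that plain Java never teaches — which is the sense in which the two languages diverge.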


  • The logger analogy is a misunderstanding of what people with a degree in CS do. Most become software engineers. They’re not loggers, they’re architects who occasionally have to cut their own logs.

    They’ve spent decades reducing the amount of time they have to spend logging, only to be continually outpaced by growing demand from businesses and the complexity of the end product. I don’t think we’ve reached a peak there yet; if anything, the capabilities of AI are opening up even more demand for even more software.

    But, ultimately, coding is only a fraction of the job and any halfway decent CS program teaches programming as a means to practice computer science and software engineering. Even when an AI gets to the point that it can produce solid code from the English language, it has a ways to go before replacing a software engineer.

    One thing that’s for sure: tons of business owners will get richer and pay fewer workers. I think we’re going to have to face a reckoning as we reach the limits of what capitalism can sustain. But it’s also unpredictable because AI opens up new opportunities for everyone else as well.