• 0 Posts
  • 34 Comments
Joined 1 year ago
Cake day: July 30th, 2023


  • ZzyzxRoad@lemm.ee to Science Memes@mander.xyz · internet points
    1 year ago

    Like the other response to this said, it’s a little more complicated than “the status quo is easier” or “intelligent doesn’t mean smart.” This is a deeply ingrained system that’s existed for a long time, and if you don’t operate within it, you don’t get to work in academia. You won’t get to conduct your research in the first place, let alone publish it, without cooperating with these institutions. There are also powerful regulatory bodies like the APA and AMA that control just about everything in their fields. You pretty much have to work for a university, and US universities are of course greedy and corrupt in their own right.

    It would be like unseating the DNC, ending the electoral college, and expanding beyond the two-party system in America, but on a smaller scale. Plenty of Americans know these things need to happen, but it’s not something you can fix by waking up one day and deciding to overthrow the system, no matter how hard you try.

  • Watching technology consistently put people out of work is enough for people to see it as a problem. You shouldn’t need to be an expert in it to have an opinion when it’s being used to threaten your source of income. Teachers have to do more work and put in more time now because ChatGPT has affected education at every level. Educators already get paid dick to work insane hours of skilled labor, and students have enough on their plates without having to spend extra time in the classroom. It’s especially unfair when every student has to pay for the actions of the few dishonest ones. Pretty ironic how it’s set us back technologically, to the point where we can’t use the tech that was created and implemented to make our lives easier. We’re back to sitting at our desks with a pencil and paper for an extra hour a week.

    There are already AI “books” being sold to unknowing customers on Amazon. How long will it really be until researchers are competing with it? Students won’t be able to recognize the difference between real and fake academic articles. Those fake articles will spread incorrect information, stealing pieces of real studies without the authors’ permission and mashing them together into some bullshit that sounds legitimate. You know there will be AP articles (written by AI) with headlines like “new study says xyz!” and people will just believe that shit.

    When the government can do its job and create fail-safes like UBI to keep people’s lives and livelihoods from being ruined by AI and other tech, then people might be more open to it. But the Lemmy narrative that overtakes every single post about AI, the one that says the average person is too dumb to be allowed to have an opinion, is not only, well, fucking dumb, but also tone-deaf and willfully ignorant.

    Especially when this discussion can easily go the other way, by pointing out that tech bros are too dumb to understand the socioeconomic repercussions of AI.

  • I think their question is more about how we would implement that. Marx believed a proletarian uprising would be the “how,” and that it’s an inevitability of end-stage capitalism. But the nature of capitalism keeps people from attempting it. This is a system we’re forced to participate in if we want to survive: we need food and shelter, and we don’t want to get arrested and/or murdered by cops for revolting. With that in mind, we’d have to get to a point where we collectively have nothing left to lose.

  • Here’s a somewhat tangential counter, which I think some of the other replies are trying to touch on: why, exactly, should we keep valuing our ability to do something a computer can so easily do for us (to some extent, obviously)?

    My theory prof said there would be paper exams next year, because it’s theory: you need to be able to read an academic paper and know what theoretical basis the authors had for their hypothesis. I’m in liberal arts/humanities. Yes, we still exist, and we’re the ones AI can’t replace. If the whole idea is that it pulls from information that’s already available, and a researcher’s job is to develop new theories and ideas and do survey or interview research, then we need humans for that. If I’m trying to become a professor/researcher, using AI to write my theory papers is not doing me or my future students any favors. Statistical research, on the other hand, already relies on software and existing data, so idk. But even then, any AI statistical analysis should be testing a new hypothesis that humans came up with, or a new angle on an existing one.

    So idk how this would affect engineering or tech majors. But for students trying to become psychologists, anthropologists, social workers, or professors, using it for written exams just isn’t going to do them any favors.

  • https://www.nationalpriorities.org/budget-basics/federal-budget-101/spending/

    By far, the biggest category of discretionary spending is spending on the Pentagon and military. In most years, this accounts for more than half of the discretionary budget. In 2020, because some discretionary spending passed through supplemental appropriations went to pandemic programs, the share of the discretionary budget that went to the military was smaller – even though the amount that went to the military was just as high as in previous years.

    Most “welfare” falls under discretionary spending. Medicare, Medicaid, and Social Security (also “welfare”) fall under mandatory spending; Social Security and Medicare make up the largest categories. This organization explains how “welfare” spending increased in recent years due to pandemic measures like stimulus checks and expanded unemployment benefits.

    The bottom line, though, is that people pay into it for years so that it’s available when it’s their turn to need it. If they never do, then great. It can help someone else, god forbid.