  • 0 Posts
  • 52 Comments
Joined 2 years ago
Cake day: April 21st, 2024

  • I think a good way to deal with this would be assignments that also (partly) prepare you for the written exam. If you sit down, work through it yourself, and actually understand the assignment, you have already done some of the studying for the exam.

    I had one course at university with homework assignments that were super tough. I did them all by myself, but in the end I learned so much that I didn’t even need to study for the written exam and got a top grade. Others who did not do their assignments on their own had to study hard for the written exam and, in most cases, got way worse grades or failed.


  • I think there is AI “art” that goes beyond typing a few words into ChatGPT and waiting for a result.

    I don’t know how popular this is today, but about two years ago I watched lots of people go wild with Stable Diffusion workflows. It was a whole palette of tools: ControlNet, inpainting, sketches fed through img2img for the composition, corrections in Photoshop, and so on (roughly the kind of pipeline sketched below). It took hours or days of manual work until people had “generated” the image they initially imagined. I would say that counts as art… Typing one prompt into your favourite LLM and taking whatever you get: not so much.

    One example for reference: https://youtu.be/K0ldxCh3cnI
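
    For reference, a workflow like the one described above might look roughly like this in code. This is a minimal sketch using the Hugging Face diffusers library, assuming a single ControlNet-guided img2img step; the checkpoint names, file names and parameters are illustrative assumptions, not the exact setup from the video.

```python
# Minimal sketch of one ControlNet-guided img2img step (Hugging Face diffusers).
# Checkpoints, file names and parameters are assumptions for illustration only.
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetImg2ImgPipeline
from diffusers.utils import load_image

# A ControlNet conditioned on Canny edges keeps the composition of the sketch.
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
)

pipe = StableDiffusionControlNetImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

sketch = load_image("composition_sketch.png")  # rough hand-drawn composition
edges = load_image("composition_canny.png")    # pre-computed Canny edge map

image = pipe(
    prompt="a castle on a cliff at sunset, detailed oil painting",
    image=sketch,            # img2img source: the sketch fixes the layout
    control_image=edges,     # ControlNet conditioning image
    strength=0.7,            # how far the model may drift from the sketch
    num_inference_steps=30,
).images[0]

image.save("draft_01.png")  # then inpaint problem areas, retouch, iterate
```

    In practice this step would be repeated many times, with inpainting passes and manual corrections in an image editor in between, which is where the hours or days of work come from.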



  • I generally agree.

    Imagine, however, that a machine objectively makes better decisions than any person. Should we then still trust the human’s decision just to have someone who is accountable?

    What is the worth of having someone who is accountable anyway? Isn’t accountability mostly an incentive for humans not to fuck things up? It’s also nice for pointing fingers when things go bad - but is there actually any value in that?

    Additionally: there is always a person who either built the machine or deployed it. IMO, the people who deploy a machine and decide that it will now be making the decisions should be accountable for those decisions.