• 0 Posts
  • 102 Comments
Joined 2 years ago
Cake day: June 26th, 2023

  • FP & OOP both have their use cases. Generally, I think people use OOP for stateful programming, and FP for stateless programming. Of course, OOP is excessive in a lot of cases, and so is FP.

    OOP is more useful as an abstraction than as a programming paradigm. Real, human, non-computer programming is object-oriented, and so people find it a natural way of organizing things. It makes more sense to say “for each dog in dogs, dog.bark()” instead of “map(bark, dogs)”.
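    As a quick sketch of the comparison (hypothetical Dog class, nothing from a real codebase), the two styles look like this in Python:

    ```python
    class Dog:
        def __init__(self, name):
            self.name = name

        def bark(self):
            print(f"{self.name} says woof")

    dogs = [Dog("Rex"), Dog("Fido")]

    # OOP style: iterate over the objects and let each one act.
    for dog in dogs:
        dog.bark()

    # FP style: treat bark as a function mapped over the collection.
    list(map(Dog.bark, dogs))
    ```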

    A good use case for OOP is machine learning. Despite the industry’s best efforts to use functional programming for it, object-oriented just makes more sense. You want each function applied to the input to carry its own set of parameters, so that you can call the function without passing the parameters every single time: you can write “function(input)” instead of “function(input, parameters)”. Then, if you are using a clever library, it keeps references to the parameters inside those functions and updates them during the optimization step. This hides how the parameters influence the result, but machine learning is a black box anyway.
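    A toy sketch of that idea (not any real library’s API, just a hypothetical Linear layer) could look like this:

    ```python
    import numpy as np

    class Linear:
        """Parameters live inside the object, so callers just write layer(x)."""
        def __init__(self, in_dim, out_dim):
            self.W = np.random.randn(in_dim, out_dim) * 0.01
            self.b = np.zeros(out_dim)

        def __call__(self, x):
            return x @ self.W + self.b

    layer = Linear(4, 2)
    x = np.random.randn(3, 4)
    y = layer(x)  # parameters are referenced implicitly, not passed in

    # An optimizer that keeps references to layer.W and layer.b can update
    # them in place; the next call to layer(x) then sees the new values.
    layer.W -= 0.01 * np.ones_like(layer.W)  # stand-in for a gradient step
    ```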

    In my limited use of FP, I’ve found it useful for manipulating basic data structures in bulk. If I need to normalize a large number of arrays, it’s easy to go “map(normalize, arrays)” and call it a day. The FP-specific functions such as scan and reduce are incredibly useful, since OOP typically requires you to set up a loop and manually keep track of the intermediate results. I will admit, though, that my only real use of FP is Python list comprehensions and APL, so take whatever I say about FP with a grain of salt.
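    For example (hypothetical normalize helper; Python’s reduce lives in functools, and its closest scan equivalent is itertools.accumulate):

    ```python
    import numpy as np
    from functools import reduce
    from itertools import accumulate

    def normalize(a):
        return a / np.linalg.norm(a)

    arrays = [np.array([3.0, 4.0]), np.array([1.0, 1.0]), np.array([0.0, 2.0])]

    normalized = list(map(normalize, arrays))                  # bulk transform
    total = reduce(lambda acc, a: acc + a.sum(), arrays, 0.0)  # fold to a single value
    running = list(accumulate(a.sum() for a in arrays))        # scan keeps intermediates
    ```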


  • I think ‘implies’ asks whether it’s possible that A causes B to be true. In other words, it is false if there is evidence that A does not cause B.

    So:

    If A is true and B is false, then the result is false, since A could not cause B to be true.

    If A and B are both true, then the result is true, since A could cause B.

    If A is false and B is true, then the result is true, since A may or may not be making B true (another factor could also be making B true).

    If A and B are both false we don’t have any evidence about the relationship between A and B, so the result is true.

    I don’t know for sure, though. I’m not a mathematician.
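    For what it’s worth, those four cases match the standard truth table for material implication, where A → B is defined as (not A) or B:

    ```python
    def implies(a, b):
        return (not a) or b

    for a in (True, False):
        for b in (True, False):
            print(a, b, implies(a, b))
    # Only the case a=True, b=False prints False; the other three print True.
    ```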


  • I personally think the change from master & slave was kind of silly. As far as I’m aware, it was driven by a bunch of people with no background in CS who thought that applying the term to something with neither race nor agency was an insult to black people.

    But I digress. It led to better guidelines in the Linux kernel, which I think are useful. You should tailor the terms you’re using to the specifics of the task. If you have a master process whose only outward interfaces are through the slave processes, you could use the terms ‘director’ and ‘actor.’ If the master process is managing slave processes that compete over the same resources, you can use the terms ‘arbiter’ and ‘mutex holder.’ If the slaves do some independent processing the master does not need to know the details of, you can use the terms ‘controller’ and ‘peripheral.’

    Basically, use a term that is the most descriptive in the context of your program.

    Edit: also, I don’t know why no one mentions this, but you can also use master/servant. Historically, there wasn’t a difference between a servant and a slave, but in modern usage there is, so it’s technically different, technically the same.


  • It’s interesting to see how propagandized the story is, according to the penguin & pufferfish (I forget if they even have names). Apparently, they were apathetic cannibals who only left their bunker so they could eat more people and fuck up the world.

    My best guess at the actual situation is that the mata inheritors were narcissistic and thought they would be the saviors of the destroyed world, quietly steamrolling all of the other survivors in their attempt to ‘rebuild civilization’ and essentially colonize the apocalypse.

    Penguin & pufferfish, likely being descendants of the unmata survivors, inherited the original survivors’ perspective on the events.





  • What’s your preferred default pronoun? As far as I’m aware, there isn’t a universally accepted replacement, since any pronoun comes with drawbacks: ‘he’ & ‘she’ are gendered, ‘it’ typically refers to non-sentient things, and ‘they’ can cause confusion about number. Of course, there are also neopronouns, but people have come up with a billion of them, and there’s no consensus or standard, so I can’t be sure the person I’m talking to will understand.





  • From my experience, being “good” at vibe coding is more about being unable to detect the flaws in AI-generated code than about being able to code well. Add AI to the workflow of someone who actually understands scalability and maintenance, and they won’t get past a couple of functions before they drop the AI.

    Also, assuming this kid gets weekends off, he would be writing 12k lines of code each day. I don’t think the average programmer could even review that number of lines in a day, so there’s likely no actual supervision for what the kid is feeding into the codebase.

    I’d estimate within four months the project will be impenetrable, and they’ll scrap the whole thing.



  • After reading a lot of comments in this thread, I’m not sure I know what spaghetti code is. I thought spaghetti code was when the order of execution is obfuscated by excessive jumps and GOTOs, but a lot of people are citing languages without those as examples of spaghetti code. Is this just a classic case of “I don’t like this programming language, and I don’t know much about it,” or is there something I’m missing?
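    For reference, what I had in mind is something like the sketch below; Python has no goto, so it fakes the jumps with a label variable:

    ```python
    # The order of execution is only discoverable by tracing the label
    # variable around the loop, which is the jump-driven flow I mean.
    def spaghetti(x):
        label = "start"
        while True:
            if label == "start":
                label = "check" if x >= 0 else "negate"
            elif label == "negate":
                x = -x
                label = "check"
            elif label == "check":
                label = "done" if x < 100 else "halve"
            elif label == "halve":
                x //= 2
                label = "check"
            elif label == "done":
                return x
    ```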





  • That’s not what I’m saying at all. What I’m trying to say is that I can’t think of any way a program working with numeric types could start outputting string types. I could maybe believe a calculator program that disables exceptions could do that, but even then, who would do that?