• 0 Posts
  • 37 Comments
Joined 1 year ago
Cake day: July 5th, 2023

  • test113@lemmy.world to Programmer Humor@programming.dev · Every Family Dinner Now
    10 months ago

    I never argued that I was in IT/Tech; I deal with investments and PE. I have nothing to do with IT or tech. My point is we, in the PE/FO sector, are going to invest in AI businesses in 24/25, not only in the “B2C market” but mainly in the B2B market and for internal applications. Whether you believe it or not, it’s gonna happen anyway.



  • test113@lemmy.world to Lemmy Shitpost@lemmy.world · A small problem
    10 months ago

    It’s a birth defect - Big Ed Brown from 90 Day Fiancé has Klippel-Feil syndrome, which makes his body look different from others.

    I don’t watch it, but making fun of someone for looks he can’t control is a low blow, so I hope they laugh at his antics and not his body. It would be kinda cheap otherwise.


  • In other words, media as a “service” makes more money than media as a one-time sale. Why should they sell you a one-time purchase when the service model makes more money for the shareholders? I love the shareholder economy; it makes all our lives better and makes us focus on what really matters at the end of the day, which is, of course, profits for people who already have too much money. :) very cool



  • test113@lemmy.world to Programmer Humor@programming.dev · Every Family Dinner Now
    11 months ago

    Again, none of the people at this talk have anything to do with selling a product or pushing an agenda or whatever you think. There was no press, no marketing, no product - it was basically a meetup of private equity firms discussing the implementation and impact of purpose-trained AI in diverse fields, which affects the business structure of the big single-family office behemoths. Think of it as an industry summit for the private equity sector on the future of AI and how some (mainly big non-public SFOs) plan to implement it.

    Sometimes people just meet to discuss strategy; no one at these talks is interested in selling you anything or buying anything - they are essentially top management and/or members of large single-family offices and other private equity firms. They are not interested in selling or marketing something to the public; they are not public companies.

    It’s weird how you guys react; not everything is a conspiracy or a marketing thing. It’s pretty normal in private equity to have these closed talks about global phenomena and how to deal with them.

    These talks are more to keep the industry informed. I get that you don’t like it when the big SFOs hold a meeting to discuss their future plans on a certain topic, but it’s pretty normal for the elite to coordinate some of their investments. It’s essentially the offices of the big billionaire families putting their heads together on a topic that might influence their business structure. In no way is it a marketing strategy; on the contrary, it would be viewed negatively in the public eye if it came out that big finance is already coordinating to implement AI into its strategy.

    But feelings don’t change facts. My point is: if the actual non-public big players are taking AI seriously, then so should you.


  • test113@lemmy.world to Programmer Humor@programming.dev · Every Family Dinner Now
    11 months ago

    Haha, lol, what’s happening? Why do you hate me? I’m just sharing an experience, an opinion.

    • it’s not NVIDIA or AMD or any chip manufacturer, or someone who has a product to sell to you. Most of them are not even publicly traded but are organized in family office structures. They don’t care about the B2C market at all; they are essentially private equity firms. You guys interpret anything to fit your screwed-up vision of this world. They don’t even have a product to sell to you or me; it was a closed talk with top industry leaders and their managers where they discussed their view of AI and how they will implement purpose-trained AI into manufacturing, etc. It has nothing to do with selling to the public.

    I have already said too much - just let me tell you: if you think LLMs are the pinnacle of AI, you are very mistaken, and depending on your position in the market, you need to take AI into account. You can only dismiss AI if you have a position/job with no real responsibility.

    So weird how you guys think everything is either to sell you something or a conspiracy - this was a closed talk to discuss how the leaders in certain industries will adapt to the coming changes. They couldn’t care less about the B2C market, aka you as an individual.

    Again, none of the people at this talk have anything to do with selling a product or pushing an agenda or whatever you think. There is no press, there is no marketing - it was basically a meetup of private equity firms that discussed the implementation and impact of purpose-trained AI in diverse fields, which affects the business structure of the big single-family office behemoths.


  • test113@lemmy.world to Programmer Humor@programming.dev · Every Family Dinner Now
    edited · 11 months ago

    Hi, I don’t want to say too much, but after being invited to some closed AI talks by one of the biggest chip machine manufacturers (if you know the name, you know they don’t mess around), I can tell you AI is, in certain regards, a very powerful tool that will shape some, if not all, industries by proxy. They compared it to the internet in that it will influence everybody’s life sooner or later, and you can either keep your finger on the pulse or get left behind. But they distinguished between the “AI” that’s floating around in the public sector and actual purpose-trained AI that’s not meant for public usage. Sidenote: they are also convinced the average user of an LLM is using it the “wrong” way. LLMs are only a starting point.

    Also, it’s concerning: I’m pretty sure the big boys have already taken over the AI market, so I don’t trust that it will benefit all of us rather than only a select group (of shareholders).



  • test113@lemmy.world to Memes@lemmy.ml · YouTube
    11 months ago

    Yeah, I know that, XD but why?

    What makes you think you should be able to get creators and their content, server capacity, and storage for free? Who should be paying for it in your mind? Who should eat the cost - the creators, the platform, or the user? Or all of them to a degree? And who should be able to profit?

    I think it’s pretty clear that the end-user will carry most of the cost in the end.


  • test113@lemmy.world to Memes@lemmy.ml · YouTube
    11 months ago

    YouTube cannot do that; the legal framework around its content does not allow it.

    That said, I use SponsorBlock and love it to the degree of finding it necessary depending on what type of content I am watching.

    Why do people hate YouTube Premium anyway? I don’t quite get it. I have had it since it was available in my country, and I love it.

    Also, I have to say I use the YouTube Vanced app with SponsorBlock and a custom layout (no Shorts, no uploads, and so on) alongside my YouTube Premium subscription. I don’t like the default YouTube app.

    So, I don’t know if I like YouTube or just the model and content/creators behind it.


  • I’m not so sure – YouTube is much larger than you might think. It’s not the video platform you grew up with anymore. No one in this world can match the backlog and content density/diversity of YouTube, not even all streaming services combined.

    People complaining that YouTube is dying because a few YouTubers “retire” from their main gig, or that it’s not the same anymore, don’t understand how YouTube works. They might not comprehend that the time of their “bubble” has come to an end. When that happens, there are already five new bubbles/niches that are even bigger; you might not have heard of them, but they are more successful than their “predecessors.” The old bubble is still there to consume in the backlog. Someday in the future, AI will have a field day with the data accumulated via YouTube.

    It is transforming, for sure, but I don’t think it will destroy itself completely. In a sense, you can say it will destroy whatever view you had of YouTube as a platform because it is not what it once was.

    To my knowledge, YouTube will hit the billion-user milestone this year (Netflix is currently at ~250 million paid users). Data trends from other streaming services suggest that YouTube will keep growing over the coming years. I don’t know how anyone could match YouTube as a whole. In certain niches, sure, but as a whole it would be like tilting at windmills. There’s a reason no one tries to tackle YouTube as a platform and everyone only goes for certain niches.


  • I mean, if I were an investor looking at this, I would also get excited about making this change - much less risk, less cost, less customer support, etc., all for basically the same output in revenue. In other words, if I cut the small business (6% of value but over 100k accounts to handle) out of the model, I can make more money because the cost reduction is higher than the loss of revenue. And in the long run, when “big game customers” jump ship, I just downsize some more. I also don’t need to invest but can be sure it will generate a certain amount of revenue, as long as I do not squeeze the relevant customer groups too hard. This strategy is very feasible and relatively risk-free. I am not a fan of it, but I think a lot of software companies will go this way after they establish themselves in a market.
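    A rough back-of-the-envelope sketch of that trade-off. Only the 6% revenue share and the 100k accounts come from the paragraph above; the total revenue and per-account support cost are made-up numbers purely for illustration:

    ```python
    # Hypothetical "drop the small accounts" math.
    # Assumed figures: total_revenue and support_cost_per_account are invented;
    # small_share (6%) and small_accounts (100k) come from the comment above.

    total_revenue = 10_000_000        # assumed annual revenue
    small_share = 0.06                # small accounts' share of revenue
    small_accounts = 100_000          # number of small accounts to cut
    support_cost_per_account = 8      # assumed yearly support/infra cost each

    revenue_lost = total_revenue * small_share               # what cutting them costs
    cost_saved = small_accounts * support_cost_per_account   # what cutting them saves
    net_effect = cost_saved - revenue_lost

    print(f"Revenue lost: {revenue_lost:,.0f}")   # 600,000
    print(f"Cost saved:   {cost_saved:,.0f}")     # 800,000
    print(f"Net effect:   {net_effect:+,.0f}")    # +200,000
    ```

    With these assumed numbers the cut nets out positive, which is the whole pitch: the saved support cost exceeds the lost revenue, so the spreadsheet says drop the small customers.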


  • Well, I hope you are right. xd

    It just seems to me like a monopolization of the market by the big tech corps, which won’t be beneficial to the majority, but at least a few billionaires will get richer.

    I was recently invited to the Google research center where they presented their new AI assistant features, which should be coming this year. It was weird; it was at the same time more capable than I thought and more restrictive than one would assume. It’s like not even Google knows exactly what to do with it, or what it should be able to do, or what exactly it is capable of. I also once got to try an “uncensored” / “unrestricted” information model, which was actually a bit scary but far more useful than any of the current “restricted” chatbots. I’m sure AI will change things up, but how, when, and why I don’t know, and the more I find out, the more unsure I am about predictions, besides the one that big corps will try to monopolize the market.


    1. Why do you care this much about online comments in such a niche community where only already opinionated people are?

    2. Yeah, if I were a moderator and needed to go over 1000 comments in today’s climate, I would delete more than necessary just because you never know. They do not put as much thought into it as you think. It was most likely just like this:

    A mod goes over comments that got reported, reads the first line of the comment, sees it has direct insulting language (the “fuck them” line), and deletes it.

    No political intent or conspiracy, just a mod being a mod. There could be some bias, but in that case there’s nothing you can do anyway; it’s just a small echo chamber then.

    Hakuna Matata, my friend.


  • The funny thing is, copyright doesn’t even matter; at least half of the world’s market couldn’t care less about copyright, especially if it’s from the “west.” They certainly won’t suddenly start respecting copyright law. They will use and develop AI without the restriction of copyright. All this talk about copyright and the law, and all the copyright suits against AI and tech firms, will be fruitless: either we forget copyright as we used to know it, or we get left behind in development because we have to respect the copyright of everything and make contracts with every big outlet. Big tech knows that, so they walk this gray zone to still train AI on copyrighted material while somehow proclaiming they are not copyright-dependent.

    I’m not saying this is a good development, just that I think we need to reassess how we treat copyright on a fundamental level under the current development structure of AI.

    We need to slow down the development of AI and hinder monopolization of the market. My guess is it’s too late, but we can still hope that maybe this time it will be different.


  • What issue? That unpaid interns, or those one step below them, don’t agree with long-term political decisions that were practically made before they were born and only understand the surface of the subject?

    Yeah, thanks. I would just ignore those as well if I were in a position of power, and you would too.

    What is this “moral responsibility,” and why is it just now relevant? There were, are, and will be much bigger and worse issues, like climate change, but no one is talking about moral responsibility and blasting the ones who are in charge like it is happening right now with the Israel/Palestine crisis.

    Maybe it is just the age of massive misinformation and propaganda campaigns from all sides (some engaging much more than others) that I have a problem with. Because, in the end, I applaud people who stand up for what they think is right, like those interns. It just comes across as too selective to be a principle. I mean, the Israel/Palestine issue has been ongoing for what, 50 years? It’s not even the first hot phase or siege of Gaza. And then you start working in politics, only then become aware of the politics, and stop working there? What?


  • This happens rarely, and when it does, the chances of it being someone we know from the media are almost zero; that would all be under the table. The “best” are the ones you don’t hear about because they are too busy working on actual stuff - the same as in most science fields.

    Most of them are recruited in “normal” ways; there is much more talent around these days, so there’s no need to engage with criminals and put them on actually sensitive stuff. Also, they get paid more than you might think; the people leading these projects are not stupid and won’t make a simple mistake like underpaying talent they still need.



  • Oh yeah, for sure, it could be a reaction to the Opium Wars.

    It’s never the drugs that make a society erode; they’re a symptom. If a country has a big drug problem, it’s most likely tied to much bigger issues at the core. In the Opium Wars, it was the British Empire that basically drugged China as a means to get what it wanted. It’s not like people discovered drugs and then just stopped doing anything else; we humans have had and used drugs for as long as we’ve known about them.

    Some argue this tactic is still very much in use today - hence the fentanyl crisis, which seems to be fueled by China. It’s a destabilizing tactic. That’s also part of why China and other Asian countries are so strict: they know firsthand the effectiveness of literally drugging your foe to gain an advantage. This doesn’t mean China and co. don’t have their own drug markets; they have a pretty vivid drug scene.

    Take Japan or China as examples: sure, you can’t buy weed - they will basically curb-stomp you legally - but you can drink as much alcohol as you want, smoke as much tobacco as you want, and down as many caffeine drinks as you want. These are all recreational drugs with a much higher impact on society than weed, yet they are totally legal and accepted by everyone, or are even traditional.