nakedfrog
Most of the layoffs – about 12,000 – will affect white-collar professionals as Nestlé targets “operational efficiency,” including by automating processes and using shared services, the company behind such brands as KitKat and Nesquik said in a statement.
Nestlé says on its website that it uses AI in a number of functions, including research and development. In its last annual report, it also said it employs automation and advanced analytics in promotional activities, such as work on discounts and in-store displays.
Nvidia at the centre of the circle jerk
Nvidia to Finance Musk’s xAI Chips as Part of $20 Billion Deal
Elon Musk’s artificial intelligence startup xAI is raising more financing than initially planned, tapping backers including Nvidia Corp. to lift its ongoing funding round to $20 billion, according to people with knowledge of the matter. (www.bloomberg.com)
In a blog post published today, Marshall Miller, the foundation’s senior director of product, said Wikipedia’s human visits are down about 8% over the past few months compared to the same period in 2024.
AI Is Killing Wikipedia’s Human Traffic:
AI Is Killing Wikipedia’s Human Traffic
Wikipedia’s human traffic is declining as more people rely on AI tools for answers. (gizmodo.com)
New User Trends on Wikipedia
Isiwal, CC BY-SA 4.0 via Wikimedia Commons. In March, the Wikimedia Foundation shared about the global trends that are impacting our movement. These trends have continued to shape not only the Wikim… (diff.wikimedia.org)
I agree with you that there is a hype bubble surrounding AI. It's also true, though, that it's not going away. Inside that bubble is a core of molten iron; I don't know how big it is compared to the entire bubble, but you aren't dislodging it at this point.

I'm generally of the opinion that AI in its current form (aside from some very niche applications) is going to fail hard, and I share the market suspicion that it's a bubble that's going to burst (IMO that burst would be a good thing, because the entire premise was flawed). However, I'm increasingly of the opinion that those in the market driving this sort of AI are doing so on a premise that makes sense: people want easy* answers, and AI can deliver on that.
* - please note I just used the word "easy", this does not imply a trifecta of say: easy, quick and accurate.
There's a quote from Incredibles 2 that I think is very appropriate:
"People want ease. People will trade quality for ease every time. It may be crap, but it's convenient!"
The below is my opinion too:
"People want ease. People will trade quality for ease every time. It may be crap, but it's convenient!"
AI to verify a picture a human is taking of another human at the DMV
OpenAI has other theoretical revenue streams, like charging a commission for items purchased through ChatGPT e-commerce integrations and ads – something CEO Sam Altman initially dismissed but now is considering.
Automated e-commerce could be a thing, someday. Stranger things have happened. Ad support, however, has proven difficult for rival Perplexity, which recently paused accepting new advertisers to rethink its revenue plan.
OpenAI's platforms account for about 80 percent of all web traffic for generative AI tools, representing 190 million of the 240 million average daily visits, according to SimilarWeb data [PDF] published in May.
OpenAI's path to profitability is easier said than done – make a product so compelling that people will pay for it. We're not there yet. ®
Watch out for the Meta Ban Wave:
Facebook and Instagram users complain of account bans
After Meta said some Facebook Groups were wrongly suspended, users tell the BBC about the impact it is having, and say it's a wider problem. (www.bbc.com)
Meta admits wrongly suspending Facebook Groups
Group administrators have reported receiving automated messages which incorrectly say they have violated the rules. (www.bbc.com)
Angry, confused and worried about police – behind Instagram bans
Hundreds of people have contacted the BBC to say they have been wrongly shut out of their accounts. (www.bbc.com)
Facebook started a ban wave for various good reasons over the summer (harmful content, etc.), but then businesses and non-offenders also got perma-banned: people with 15+ year old accounts, Instagram accounts, Quest headset accounts and all associated purchases, small-business pages, and so on. For the past few weeks there has been an appeal loop where people can NEVER get back in. It seems to be a mix of an overzealous AI auto-moderator, a faulty Meta update, and the AWS downtime, and it has only gotten worse. There are rumors that millions of accounts are just completely gone:
Legally solve problems with Instagram and Facebook
Solving problems with your Instagram or Facebook account. We help you legally restore a disabled account, regain access to a hacked account, fix problems with advertising on Instagram or Facebook. (social-me.co.uk)
It's easy to brush it off as "good, leave Facebook!", but deplatforming can be HUGELY isolating for people, especially disabled people, and business-killing for many small businesses. I have friends who rely on FB & IG for selling their services like baking, nails, car detailing, etc. Cambodia in particular has so many bans that the government had to get involved because of the account-restore scams going on:
This is the future, right now, today:
1. Your whole life & business ingrained in social media
2. AI ban with no human intervention
3. ¯\_(ツ)_/¯
This summer, my Facebook account was permanently “disabled.”
I had been helping another mother navigate a medical decision for her child, a young adult who is living with a degenerative condition. It wasn’t medical advice. It was empathy, drawn from my own lived experience as a medical mother, a certified coach, and years of teaching courageous communication at a medical school.
Meta AI had flagged the conversation as a violation of Community Standards on child abuse. My words, marked by an algorithm that couldn’t distinguish exploitation from support. My account, and years of advocacy, caregiving, and connection, disappeared overnight.
It took six weeks, multiple appeals, and a friend-of-a-friend inside the company to reach a human being who confirmed it had been a mistake.
By the time my account had been restored, something in me had shifted.
Algorithms don’t understand context or care.
For families like mine (parents navigating rare diseases, disability, or chronic illness) online communities have become lifelines. This is where we go when the rest of the world sleeps. I can post in a support group in the middle of the night, and find another parent across the country, or across the world, who understands. Someone who doesn’t need the backstory to respond with compassion.

Peer-to-peer communities, often hosted on platforms like Facebook and Instagram, have quietly become part of our public health infrastructure. They lower isolation, reduce caregiver stress, and increase engagement with care plans.

These same spaces are now being monitored by algorithms that flag “dangerous content.” When an AI system can’t tell the difference between misinformation and a parent sharing fear or uncertainty, it can silence the very support families depend on. When language about fear, prognosis, or end-of-life care is automatically deemed suspect, taken out of context and out of community, we risk losing the capacity to talk about the hardest parts of medicine at all.
When we ban words, we lose people.
There’s a quiet irony here. Medicine already struggles with language: the words we avoid, the silences that form around suffering, disability, and uncertainty.
Now, those silences are being automated.
If algorithms start deciding which stories are safe to tell, we risk losing the spaces where caregivers and families process what cannot be fixed.
These are not peripheral conversations. They are central to healing.
Clinicians need to care about where families are finding support and how those spaces are being shaped. Because if families can’t talk about what scares them online, in spaces built for comfort and connection, they may stop talking about it altogether. Especially in the clinic.
Engagement is everything
Doctors worry about misinformation online. Rightly so. Social media is rife with false expertise and outright fabrication. But the solution isn’t censorship. It’s engagement. Families rarely turn to Facebook because they distrust their doctors. They join because they need to be heard. They want someone to stay with them in the unknowing. When medicine steps out of the conversation, we leave room for fear to grow in silence.
Health care professionals need to be aware of and engaged in these digital spaces. To model, not to monitor. To partner in what respectful, evidence-informed, compassionate dialogue can look like. Doctors, nurses, allied health professionals, and educators can play a vital role in fostering healthy peer-to-peer support networks. The same empathy brought to the bedside can be extended to the comment thread.
Connection cannot be automated. Listening cannot be outsourced.
We need to build softer communities.
As a mother-scholar, I live in the dual worlds of clinical education and clinical navigation. Through my courses and workshops, I remind health care professionals that engagement is not a distraction from professionalism. It’s part of it. I’m rebuilding those softer spaces through my Substack, The Soft Bulletin, and through my coaching work with clinicians and caregivers. I’m not leaving connection behind. I’m rebuilding it. What I want, and what I believe many clinicians and caregivers want, is a softer kind of community. One that values curiosity over compliance, listening over labeling, and conversation over control.
As health care professionals, we must ask: Where are our patients finding connection? What happens when the algorithms that shape those spaces decide their words are dangerous?
If we want to protect mental health, trust, and humanity in medicine, we have to keep talking.
Because engagement is everything. Healing doesn’t happen in isolation. It happens in dialogue. Even, and especially, the hard conversations.