AI-generated illustration: Semiconductor Chip Technology
Image generated with Pollinations.ai
Weekly Briefing · 6 min read

AI Weekly - Week 52/2025

Sunday, December 28, 2025

This article was researched and written with AI

TL;DR

This week in 30 seconds:

  • RAM Crisis: AI is devouring memory chips – 50% price increase, shortage until 2027
  • Red Lines: Microsoft refuses “runaway” superintelligence – Suleyman sets clear boundaries
  • Mega Deal: Nvidia licenses Groq for $20 billion – focus shifts to inference

Story of the Week

AI is Eating the Memory Market – And Consumers Pay the Price

The AI revolution has an unexpected casualty: memory chips are becoming a luxury good. [1] Micron, one of the world’s largest RAM manufacturers, reported record earnings this week – along with troubling news: demand significantly exceeds supply. [1]

The problem is structural. AI datacenters are consuming massive amounts of High-Bandwidth Memory (HBM) to feed their GPUs. [1] What is champagne for Nvidia shareholders translates into more expensive laptops, smartphones, and gaming consoles for consumers.

The numbers are stark: DRAM prices rose 50% this quarter. [1] Rush orders now cost device manufacturers two to three times the normal price. And analysts project another 40% increase next quarter. [1]

“Aggregate industry supply will remain substantially short of the demand for the foreseeable future.” - Sanjay Mehrotra, Micron CEO [1]

Dell COO Jeff Clarke has already warned that these costs will be passed on to customers. [1] Tech analyst Avril Wu recommends consumers buy now if they need new devices. [1]

The bitter irony: Micron has even shut down its consumer SSD and RAM division – to free up capacity for more lucrative AI chips. [1] Relief isn’t expected until 2027, when new manufacturing facilities come online. [1]

Bottom Line: The AI boom comes at a price, and consumers will feel it on their next hardware purchase.


More Top Stories

Bad Blood Author Takes on AI Industry

John Carreyrou, the investigative journalist behind the Theranos scandal, has a new target: the entire AI industry. [2] This week, a group of authors under his leadership filed suit against OpenAI, Anthropic, Google, Meta, xAI, and Perplexity. [2]

The accusation: The companies trained their models on pirated copies of copyrighted books – from sources like LibGen, Z-Library, and OceanofPDF. [2] The plaintiffs are demanding $150,000 per work per defendant – with six defendants, potentially $900,000 per book. [2]

For comparison: The previous Anthropic settlement offered only about $3,000 per author – just 2% of the statutory maximum. [2] Carreyrou and company want more.


Microsoft Pulls the Superintelligence Emergency Brake

While OpenAI and others race toward AGI, Microsoft is taking a different path. [3] AI chief Mustafa Suleyman announced the creation of the “MAI Superintelligence Team” – with a clear philosophy: “Humanist Superintelligence” instead of unlimited capabilities. [3]

The crucial difference? Microsoft wants specialized systems for medicine and energy rather than an infinitely capable generalist. [3] Suleyman justifies this with controllability: broadly capable superintelligence cannot be adequately governed. [3]

“We won’t continue to develop a system that has the potential to run away from us.” - Mustafa Suleyman, Microsoft AI CEO [3]

His “red lines”: Containment and alignment are non-negotiable. No system goes live that doesn’t meet these criteria. [3] In an industry optimizing for maximum capability, that’s a remarkable statement.


Nvidia Buys Inference Dominance

On Christmas Eve, Nvidia announced the largest deal in company history: $20 billion for a non-exclusive license to Groq’s technology – plus acquisition of the founding team. [4]

What Nvidia gets: Jonathan Ross, who co-invented the TPU at Google, and access to Groq’s Language Processing Units (LPUs). [4] These chips are optimized for inference – not training – and promise 10x higher speed at one-tenth the energy consumption. [4]

The message is clear: Training is largely solved, now it’s about efficient inference. [4] Groq had raised $750 million at a $6.9B valuation as recently as September. [4] That Nvidia is paying nearly triple that shows how hot the inference market is getting.


Quick Hits

  • OpenAI Seeks Preparedness Chief: After reassigning Aleksander Madry in July, OpenAI is again looking for an AI safety executive. [6] Details
  • Waymo Tests Gemini in Robotaxis: Passengers can ask the AI assistant questions and control cabin features. TechCrunch
  • Alphabet Acquires Intersect Power: $4.75 billion for dedicated renewable energy – bypassing grid bottlenecks. TechCrunch
  • ChatGPT Wrapped: Spotify-style year-end review for ChatGPT users with personalized stats. Launch
  • Anthropic Skills as Open Standard: Workflow definitions now portable between tools like Notion, Figma, and Atlassian. AI Business
  • Disney + Sora: 200+ Disney characters coming to OpenAI’s video generator – short films on Disney+. OpenAI

Tool of the Week

Anthropic Skills - Reusable Workflow Definitions for Claude

Anthropic has made “Skills” an open standard. Skills are instruction sets that teach Claude specific workflows, standards, and domain knowledge – and are now portable between different platforms.

The integration with Notion, Figma, Canva, and Atlassian makes this particularly interesting for teams: Once-defined processes work the same everywhere.
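To make this concrete: a Skill is essentially a folder containing a SKILL.md file – YAML frontmatter with a name and description, followed by plain-Markdown instructions that Claude loads when the skill is relevant. Here is a minimal sketch; the skill name and rules below are invented for illustration, not taken from Anthropic’s examples:

```markdown
---
name: release-notes
description: Draft release notes from a list of merged changes in our house style.
---

# Release Notes Skill

When asked to write release notes:

1. Group changes under "Features", "Fixes", and "Internal".
2. Write one sentence per change, present tense, no trailing period.
3. Add an "Upgrade notes" section only if a change is breaking.
```

Because the format is just Markdown plus frontmatter, the same skill folder can in principle be dropped into any platform that adopts the standard – which is what makes the Notion/Figma/Atlassian portability claim plausible.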

Who should use it: Anyone looking to standardize repetitive AI workflows. More Info


Fail of the Week

“Slop” is Word of the Year – And It’s Deserved

Merriam-Webster has spoken: The word of 2025 is “slop” – defined as “low-quality digital content usually produced in quantity by artificial intelligence.” [5]

The word’s history is fitting: In the 18th century it meant “soft mud,” then “pig feed,” eventually “garbage.” [5] Now it stands for everything from bizarre AI videos to fake news to what Merriam-Webster calls “workslop” – AI-generated reports that waste coworkers’ time. [5]

The implicit message to the AI industry: For all the superintelligence talk – sometimes you just don’t seem that intelligent.


Number of the Week

$20 Billion

Nvidia’s largest deal ever – for Groq’s inference technology [4]

For comparison: The previous record was Mellanox in 2019 for $6.9 billion. This deal is nearly 3x larger. [4] The message: Whoever dominates AI inference dominates the next phase of the AI revolution.


Reading List

For the weekend:

  1. AI is Eating the Memory Market - Why your next devices will cost more (5 min)
  2. Microsoft’s Humanist Superintelligence - Suleyman’s alternative to the AGI race (8 min)
  3. Why “Slop” Became Word of the Year - The cultural significance of AI garbage (4 min)

Next Week

What’s coming:

  • CES 2026 kicks off in early January in Las Vegas – expect AI hardware announcements
  • New OpenAI Releases? After GPT-5.2 and Image 1.5, more could be coming
  • Anthropic + Accenture: Details on training 30,000 professionals
