Hello, Human Guide
Today, we will talk about these THREE stories:
OpenAI’s rumored cheaper, smarter model tier
NVIDIA’s Blackwell chips selling out into 2026
Apple’s slow, privacy-first AI rollout
The Lithium Boom is Heating Up
Thanks to growing demand, lithium stock prices more than doubled from June 2025 to January 2026: $ALB climbed as much as 227%, $LAC gained 151%, and $SQM rose 159%.
This $1B unicorn’s patented technology can recover 3X more lithium than traditional methods. That’s earned investment from leaders like General Motors.
Now they’re preparing for commercial production just as experts project 5X demand growth by 2040. They’ve announced what could be one of the US’ largest lithium production facilities and have rights to approximately 150,000 lithium-rich acres across North and South America.
Unlike public stocks, EnergyX shares are private, and you can buy in alongside 40,000+ other investors. Invest for $11/share by the 2/26 deadline.
This is a paid advertisement for EnergyX Regulation A offering. Please read the offering circular at invest.energyx.com. Under Regulation A, a company may change its share price by up to 20% without requalifying the offering with the Securities and Exchange Commission.
OpenAI Is Slashing Prices And It’s Not Generosity

The AI price war just escalated again.
Multiple reports from The Information and The Verge suggest OpenAI is preparing a new model tier optimized for lower inference costs while maintaining GPT-4-class performance. The company already cut API prices by up to 50% in 2024, and developers report inference costs dropping more than 80% since the early GPT-4 releases as competition from Anthropic and Google intensifies.
What stands out is that this is less about generosity and more about a land grab. When model costs fall this fast, you can almost hear startup founders refreshing their AWS dashboards at midnight, watching margins widen. The real shift is not capability; it is affordability, and that pulls entire industries into automation faster than they planned.
If cheaper frontier models become the default, smaller startups can suddenly ship features that used to require venture scale burn. The moat moves from model access to distribution and data.
If intelligence keeps getting cheaper every quarter, the real question is who captures the value when everyone can afford it.
NVIDIA’s Blackwell Is Sold Out Into 2026

The chips are gone before they even arrive.
NVIDIA’s next-generation Blackwell GPUs are reportedly sold out through much of 2026, according to supply chain checks cited by Reuters and Bloomberg. Hyperscalers like Microsoft, Amazon, and Google are locking in capacity early, after NVIDIA posted $22.1 billion in quarterly data center revenue last year, up 265% year over year at the peak of the demand cycle.
What struck me is how this feels like airlines pre-booking fuel during a war. Cloud providers are not waiting to see if AI demand sticks; they are committing billions upfront while fluorescent-lit data centers hum 24/7. When supply gets locked years ahead, experimentation becomes a luxury only the biggest players can afford.
This bottleneck quietly reshapes competition. If compute is pre-sold, startups must rely on optimized small models, partnerships, or secondary markets.
If the hardware is spoken for two years out, the real question is whether innovation accelerates or consolidates around whoever reserved the racks first.
Apple’s Private AI Strategy Is Falling Behind

Apple wants AI without the cloud.
At WWDC, Apple unveiled Apple Intelligence, a privacy-first AI system designed to run partly on device across iPhone and Mac. But early developer feedback suggests rollout delays, limited API access, and narrower capabilities compared to cloud-native systems from OpenAI and Google, according to reporting from TechCrunch and The Wall Street Journal.
I think this is less about capability and more about philosophy. Apple is betting that privacy and tight hardware integration will matter more than raw model power, even if that means slower feature velocity. Late at night, when developers test SDKs on glowing MacBook screens, speed still wins attention, and Apple is deliberately choosing restraint.
The tradeoff is clear: control versus scale. If Apple waits too long, developers may build elsewhere first and optimize later.
If privacy first AI ships slower but safer, the real question is whether users will reward patience or chase power.