Do More Newsletter

This issue features the article “AI vs. Your Power Bill: What’s Really Driving U.S. Electricity Prices,” plus product news about Hero SDK, DeepL Agent, Akkio, YouWare, and Baidu. Keep up to date on the latest products, workflows, apps, and models so you can excel at your work. Curated by Duet.

In partnership with

Stay ahead with the most recent breakthroughs—here’s what’s new and making waves in AI-powered productivity:

Hero SDK Completes Your Prompts

Hero, a rising productivity app, just announced the launch of its new SDK that leverages advanced AI to autocomplete prompts for scheduling and daily planning. By integrating natural language suggestions, Hero helps users seamlessly find time for meetings or even social engagements. Designed to reduce manual typing and time wastage, its prompt-completion speeds up calendar interactions and keeps schedules fluid—ideal for professionals juggling multiple commitments.

DeepL Agent: Autonomous Language Productivity

DeepL's newly available Agent is transforming productivity for knowledge workers. Acting as an AI coworker, DeepL Agent can automate repetitive tasks and centralize routine language workflows, minimizing manual reviews. Paired with their Customization Hub, businesses can now streamline translations, project completions, and multilingual communications across over 100 languages. DeepL's latest update makes collaboration and automation easier than ever for global teams.

Akkio: Automated Analytics for Creators

The growing demand for actionable insights led to Akkio's new analytics dashboard, which automatically transforms ordinary spreadsheets into dynamic business forecasts. Akkio employs predictive analytics to quickly surface campaign trends and revenue predictions—no coding required. By automating data visualization, it enables marketers, creators, and small businesses to be data-driven without technical hurdles.

YouWare: Creator Platform Milestone & Expansion

YouWare, an AI creation platform focused on community-driven content and creator tooling, announced a major valuation and expansion milestone this week and signaled upgrades to its creator toolset and marketplace features to scale collaboration between creators and brands.

Baidu: GenFlow, Oreate, Miaoda, Digital-Human Upgrades

Baidu unveiled ERNIE 5.0, described as an omni-modal foundation model, alongside a slate of productivity and creator products (GenFlow agent framework, the Oreate one-stop AI workspace, Miaoda no-code builder, and upgrades to its digital-human tech). The package is framed as both a capability and a platform push for enterprises and creators, emphasizing multi-modal understanding (text, image, audio, video) and agentic workflows.

DeepL Agent—The AI Coworker Redefining Productivity

Inside DeepL Agent’s Innovation

With the exponential growth of remote work and international projects, the volume of repetitive language tasks—like translating, summarizing, or drafting communications—has surged. DeepL’s Agent steps in as a powerful autonomous tool, capable of handling more than just basic translation. By learning team workflows, it automates context-switching across apps and websites, centralizing all core tasks—a major boost for anyone toggling between systems up to 1,200 times per day.

What sets DeepL apart is its blend of automation and human-in-the-loop expertise, which enhances accuracy while letting teams focus on creative, high-impact tasks. The Customization Hub cements its role as an indispensable part of the modern productivity stack, offering secure, scalable language management that is a true differentiator in the AI market.

AI vs. Your Power Bill: What’s Really Driving U.S. Electricity Prices

Let’s just get right to it: yes, AI is already moving U.S. power prices—noticeably in places that host concentrated data‑center build‑outs—and it’s poised to matter a lot more by 2030. But “AI did it” is too easy. Fuel costs, weather, transmission bottlenecks, and generic load growth still set the table; AI’s surge is the new heavyweight sitting down to eat. The result is a visible uptick in wholesale prices and capacity charges around data‑center hubs (Northern Virginia, parts of PJM, Oregon, fast‑growing pockets of Texas), while other regions with abundant new generation (notably solar in ERCOT) can mute or offset the effect.

The demand shock in plain English

What changed isn’t that computing suddenly uses electricity; it’s the pace and density of demand from AI training and inference. The U.S. Energy Information Administration (EIA) now treats “computing” as a distinct end‑use because it’s growing faster than anything else in commercial buildings. In 2024, computing already made up an estimated 8% of commercial sector electricity consumption, and its share is projected to more than double by mid‑century. AI is a big reason the demand curve has stopped being flat.

Zoom out globally and the International Energy Agency’s (IEA) 2025 outlook is blunt: data‑center electricity use is set to double by 2030, reaching roughly 945 TWh, with average demand growth of ~15% per year—four times the rate for the rest of the economy. That’s the macro backdrop into which U.S. price dynamics fit.

The U.S. numbers are already big. Best available estimates put U.S. data‑center consumption at ~183 TWh in 2024 (just over 4% of all U.S. electricity), rising to ~426 TWh by 2030 in baseline scenarios. DOE’s congressionally‑mandated study likewise found data centers at ~4.4% of U.S. electricity in 2023, with scenarios reaching 6.7%–12% by 2028 depending on how aggressively AI scales. Even the low end is a historic load addition for a mature grid.
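The arithmetic behind these projections is easy to check. A minimal sketch using the TWh figures quoted above and the standard compound-growth formula (no figures here beyond those in the text and the IEA's ~15%/yr rate):

```python
# Sanity-check the growth figures quoted above.
# Implied annual growth rate from ~183 TWh (2024) to ~426 TWh (2030):
start_twh, end_twh, years = 183, 426, 6
cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied U.S. data-center CAGR: {cagr:.1%}")   # ~15% per year

# Conversely, the IEA's ~15%/yr figure compounds over the same window to
# roughly a doubling:
print(f"1.15^6 = {1.15 ** 6:.2f}x")                    # ~2.3x
```

The two numbers lining up (~15%/yr on one hand, ~2.3× by 2030 on the other) shows the doubling claim and the growth-rate claim are essentially the same statement.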

Two details make the AI wave uniquely grid‑relevant:

  • Extreme power density: A generative‑AI training cluster can pull 7–8× more energy than a typical compute workload. You’re not adding “a bit of IT”; you’re adding an industrial‑scale load that stresses substations, feeders, and cooling systems.

  • Usage that doesn’t stop at training: Once models are deployed, inference (people and apps hitting the models) becomes the metronome. Alphabet’s chair, John Hennessy, put a fine point on cost intensity when he said an AI chat exchange likely costs ~10× a standard keyword search, which squares with what utilities are seeing when inference traffic ramps.

If you want one emblematic training number: independent academic estimates peg GPT‑3’s training at ~1,287 MWh—enough electricity to power over a hundred U.S. homes for a year. Newer foundation models dwarf GPT‑3 in parameter count and training runs, but the companies haven’t published apples‑to‑apples energy disclosures. The safe conclusion is that training is lumpy but increasingly overshadowed by sustained inference demand as usage explodes.
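The "hundred-plus homes" comparison follows from a single division. A sketch, assuming an average U.S. household uses roughly 10,500 kWh per year (an EIA-ballpark assumption, not a figure from the text):

```python
# Convert GPT-3's estimated training energy into household-years.
training_mwh = 1_287                 # academic estimate cited above
avg_home_kwh_per_year = 10_500       # assumed U.S. residential average

training_kwh = training_mwh * 1_000
home_years = training_kwh / avg_home_kwh_per_year
print(f"~{home_years:.0f} U.S. homes powered for a year")  # ~123
```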

Where the load is landing (and why prices react there first)

The impact is most visible where data centers cluster and interconnect rapidly:

  • Northern Virginia (Dominion Energy Virginia)—the epicenter of “Data Center Alley.” Dominion’s 2024 annual report says data centers account for about 26% of its total electric load, up from ~21%–24% the prior two years. Virginia as a whole is now the top net importer of power among U.S. states—36% of its electricity supply in 2023 came from other states—because in‑state demand, heavily shaped by data centers, has outrun local generation and transmission capacity. That import reliance and the sheer volume of new large loads are key price drivers.

  • PJM (mid‑Atlantic and Midwest)—where capacity charges are spiking. The grid’s independent market monitor and multiple consumer filings link record capacity auction outcomes to load growth “chiefly from data centers.” The price translation isn’t abstract: Baltimore‑area bills rose by more than $17/month after a record auction—and another $4/month is teed up in mid‑2026. One synthesis pegs the PJM‑wide capacity cost increase at $9.3 billion for the year starting June 2025. That’s not “maybe”; it’s on people’s bills.

  • Oregon (Portland/Hillsboro)—numerous major data centers are concentrated in a tight corridor; lawmakers passed the POWER Act to rebalance how large loads share costs. The principle spreading through commissions: if you’re creating the load spike, you help fund the upgrades.

  • Texas (ERCOT)—the paradox case. On one hand, ERCOT documents show new data centers are the major area of load growth, and interconnection requests for large loads ballooned from ~56 GW (Sep 2024) to ~200 GW+ in 2025, far above current peak demand. On the other hand, massive solar build‑out (plus wind and batteries) has depressed average wholesale prices at times, so the AI effect is uneven: more visible in local congestion, peak hours, and capacity provisioning than in annual price averages. Translation: Texas is absorbing a ton of new load, but price outcomes depend on when and where you look.

The price mechanisms: three ways AI shows up on your bill

1) Wholesale price pressure near hubs.
A Bloomberg analysis of tens of thousands of grid pricing nodes shows wholesale electricity costs as much as 267% more than five years ago in areas within ~50 miles of major data‑center activity. Price spikes are concentrated around data hubs like Baltimore’s tie to Northern Virginia. Since retail tariffs pass through commodity costs over time, nearby households feel it.

2) Capacity market charges.
Regions like PJM procure “capacity” years ahead to ensure enough generation at peak. When a wave of firm, non‑interruptible demand arrives (data centers), the market clears at higher prices. Result: bill adders of $16–$18/month in specific zones (e.g., western Maryland, parts of Ohio) and multi‑billion‑dollar cost increments grid‑wide. You can argue about design choices, but you can’t argue it isn’t real money.

3) Grid upgrades and transmission.
These clusters need new substations, transformers, high‑voltage lines. Absent special tariffs, costs get “socialized” across customer classes. PJM approved $5.9 billion in new transmission in early 2025 citing load growth primarily from data centers; commissions from Oregon to Virginia are now debating how much of that should be paid by the hyperscalers versus everyone else.

Now, reality check: AI isn’t the only price driver. EIA’s 2025 outlook explicitly calls out natural gas prices and regional generation mix alongside AI/data‑center load. In California and the Southwest, for example, wholesale prices were projected to jump 30%–35% year‑over‑year going into 2025—AI is part of the story, but fuel and weather matter, too. Meanwhile Texas was an exception, with lower average wholesale prices on the back of a solar surge even as new data centers queue up. This is why you can’t just blame AI for every bill increase.

Training vs. inference: what the grid actually “sees”

Training is lumpy: when a model is being trained, big clusters run flat‑out for days or weeks—high, concentrated power draw. That’s when the “7–8× typical compute energy” effect matters. Inference, by contrast, is continuous and scales with adoption; as generative AI gets embedded into search, office suites, coding tools, and customer service, the “always on” traffic dominates total energy use. The per‑query gap is why industry insiders warn about unit‑economics: an AI chat can cost an order of magnitude more than a keyword search. As product teams wire AI into everything, utilities see not one spike, but a rising baseline.

On absolute numbers, transparent disclosures are still rare. The last widely cited academic estimate for a frontier model—GPT‑3 at ~1,287 MWh for the training run—gives a sense of scale, but it’s already a small model by 2025 standards. The meaningful policy point isn’t the exact tally on any one run; it’s that inference at mass adoption beats training in cumulative energy and that growth curve is steep.
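To see why cumulative inference can dwarf a one-off training run, compare the two directly. A rough sketch, assuming ~0.3 Wh per LLM response (a commonly cited but unverified ballpark, not a figure from the text):

```python
# Break-even: how many inference queries equal one GPT-3-scale training run?
training_wh = 1_287 * 1_000_000      # 1,287 MWh expressed in watt-hours
wh_per_query = 0.3                   # assumed per-response energy (illustrative)

breakeven_queries = training_wh / wh_per_query
print(f"Break-even at ~{breakeven_queries:,.0f} queries")  # ~4.3 billion
```

A service handling hundreds of millions of queries a day crosses that line within weeks, which is exactly the "rising baseline" utilities report seeing.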

International signals (read: where U.S. hubs might be heading)

Want a cautionary data point? Ireland. As of 2023, data centers there used 21% of all metered electricity, more than all urban households combined, forcing a pause on new Dublin connections and intense debates over who pays for system upgrades. The U.S. isn’t Ireland, but Northern Virginia is showing similar “concentration” math: when a small area takes a massive share of national compute, local infrastructure and prices feel it first.

The other side of the ledger: forces that lower or contain prices

If you’re tempted to treat AI as a one‑way ratchet on bills, hold up. There are counterweights:

  • Rapid supply build‑out in renewables and storage. ERCOT’s solar and battery boom has delivered long stretches of low (even negative) wholesale prices despite surging large‑load requests. That doesn’t mean no pain—peaks and congestion still bite—but it shows how quickly supply can blunt average prices in competitive markets.

  • Efficiency gains inside the data centers. Google/DeepMind’s controls cut cooling energy use ~40%—a notable wedge because cooling can be a large chunk of non‑IT load. Hyperscalers have slashed PUEs and keep iterating on chips, power capping, and model optimization (quantization, sparsity), all of which reduce watts per inference. These don’t eliminate the total‑load effect, but they stretch infrastructure further per dollar invested.

  • Flexible load/demand response. This is new: companies are starting to curtail AI compute in peak hours under utility agreements to relieve stress and lower system costs. Google, for example, signed curtailment deals with two U.S. utilities in 2025. If regulators scale this—especially for training workloads that are timing‑flexible—it reduces the need for expensive standby capacity that shows up as bill adders.

  • On‑site or contracted generation. Expect more data centers to pair with dedicated generation—gas turbines, utility‑scale renewables, even SMR‑class nuclear in filings—so the “who pays for upgrades” fight isn’t entirely socialized. Dominion’s 2025 request for small modular reactor proposals shows how far utilities are now thinking to meet sustained AI‑driven load growth in Virginia.

These mitigations won’t erase price impacts in hot zones, but they change the slope—and, critically, who pays.
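The PUE arithmetic behind the efficiency point above is simple: PUE is total facility energy divided by IT energy, so cutting cooling load cuts PUE directly. A toy example with illustrative load fractions (the 1.6 starting PUE and the cooling/other splits are assumptions; the 40% cooling cut is the DeepMind-style figure discussed in this piece):

```python
# PUE = total facility energy / IT energy. Shrinking cooling load shrinks PUE.
it_load = 1.0          # normalized IT energy (servers)
cooling = 0.5          # assumed cooling overhead (illustrative)
other = 0.1            # assumed power conversion, lighting, etc.

pue_before = (it_load + cooling + other) / it_load
cooling_after = cooling * (1 - 0.40)          # 40% cooling-energy cut
pue_after = (it_load + cooling_after + other) / it_load
print(f"PUE: {pue_before:.2f} -> {pue_after:.2f}")  # 1.60 -> 1.40
```

That mechanism is how hyperscalers push toward the ~1.1–1.2 PUEs mentioned later in this piece: less overhead per watt of compute, even as total load keeps growing.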

What the “main movers” are doing to U.S. prices—company by company (without the PR gloss)

Cloud giants (Amazon, Microsoft, Google) are the demand engine. Their capex signals say the quiet part out loud: hundreds of billions to build hyperscale capacity and specialized AI clusters. In 2024 alone, the big three spent $200+ billion (much of it for data centers), and 2025 headline deals continue at gigawatt scale. When those interconnection requests hit a regional queue all at once, local capacity prices jump. The PJM capacity market is the cleanest public window into that effect.

Chip vendors (NVIDIA et al.) aren’t the ones filing interconnection requests, but each successive generation pushes up server rack power density. That cascades into cooling retrofits and substation upgrades—capital that utilities recover through rates unless commissions ring‑fence costs.

Hyperscalers’ energy teams are also the only reason this hasn’t been worse. They’ve pushed PUEs toward ~1.1–1.2, stacked long‑dated renewable PPAs, and (increasingly) signed up for peak‑hour curtailment. Those steps buffer the grid, even if they don’t neutralize the aggregate load.

What to watch between now and 2030

  • Baseline demand keeps climbing. EIA expects U.S. electricity consumption to set new records in 2025 and 2026, reversing a decade of flat demand. That makes it easier for any incremental load to move prices.

  • Global data‑center electricity doubles by 2030. The U.S. takes the largest slice, so pricing effects here will be ahead of the global average.

  • U.S. AI/data‑center load roughly doubles this decade. Reasonable base cases point to ~426 TWh by 2030 (from ~183 TWh in 2024). In service territories like Dominion’s, data centers are already a quarter of total load and rising. Concentration, not just growth, is the risk.

  • PJM as the bellwether. If subsequent capacity auctions keep clearing high—and the market monitor is already warning of a “maximum price for the foreseeable future”—expect more $10–$20/month bill adders in hard‑hit zones unless large‑load customers fund more of their own infrastructure.

  • ERCOT as the stress test. Interconnection requests in the hundreds of gigawatts won’t all build, but even a fraction will remake local grids. Watch whether Texas keeps average prices in check via renewables and storage while still grappling with local congestion and peak pricing where data‑center clusters connect.

  • Policy and rate design. We’re already seeing states pass tools like Oregon’s POWER Act; expect more “special contracts,” interruptible‑load tariffs, and demand‑response mandates targeted at training workloads. The meta‑question is simple: Who pays for the boom?

What you might be getting wrong

“AI is the main reason my bill went up.”
Sometimes. In Baltimore and chunks of PJM, the link is direct via capacity charges and wholesale prices near data hubs. But in California and the Southwest, 2025’s price hikes were also tied to fuel and weather; in Texas, average wholesale prices fell with a solar wave even as data‑center queues exploded. If you don’t separate commodity costs, capacity charges, and wires charges, you’re arguing with a blur.

“We’ll solve it with efficiency.”
We’ll blunt it, not erase it. A 40% cut in cooling energy is real and worth chasing; AI‑assisted operations will keep squeezing the denominator (kWh per unit of compute). But efficiency gains have a habit of being out‑run by new uses (Jevons‑style) when the product is cheap and popular. The only durable fix is more flexible demand + more generation + more transmission.

“Just ban new data centers.”
Enjoy exporting the problem to your neighbor and importing their electrons. Virginia already imports over a third of its power; bans shift build‑out (and tax base) across borders and push up congestion costs. Better: price the externalities correctly (interconnection deposits, cost‑sharing for upgrades, mandatory demand response) and site intelligently near new generation.

Our energy-consuming AI future

AI is no longer a rounding error in U.S. electricity demand. It’s a primary source of new load in multiple regions, and in places with concentrated build‑outs it is already adding dollars to bills—via wholesale prices, capacity markets, and grid‑upgrade pass‑throughs. The clearest evidence: PJM’s bill adders and wholesale jumps near data hubs. At the same time, regions awash in new generation show you can accommodate a lot of AI without breaking average prices—if supply and siting keep pace.

The 2030 endpoint is not mysterious: global data‑center electricity roughly doubles, U.S. consumption roughly doubles, and pricing will hinge on whether we add enough steel in the ground (wires and plants) and enough flexibility (curtailment and storage) to keep the system from paying peak‑hour scarcity rents. If we don’t, expect more $10–$20/month adders in data‑center hot spots and a politicized blame game over who shoulders them. If we do, AI still raises the floor on load but doesn’t have to raise the ceiling on prices everywhere.

Partner Spotlight: Duet Display

Duet Display transforms your devices—Mac, PC, iPad, iPhone, or Android—into a lightning-fast second monitor, a Remote Desktop client, and even a graphics tablet for digital illustration. 

Discover all the newest features at Duet Display.

Find customers on Roku this holiday season

Now through the end of the year is prime streaming time on Roku, with viewers spending 3.5 hours each day streaming content and shopping online. Roku Ads Manager simplifies campaign setup, lets you segment audiences, and provides real-time reporting. And, you can test creative variants and run shoppable ads to drive purchases directly on-screen.

Bonus: we’re gifting you $5K in ad credits when you spend your first $5K on Roku Ads Manager. Just sign up and use code GET5K. Terms apply.

Stay productive, stay curious—see you next week with more AI breakthroughs!