
For Microsoft, will Trump’s antitrust and environmental views help or harm?

I recently wrote about how President-elect Donald J. Trump’s actions on AI might affect Microsoft. This week, I’m focused on what his antitrust regulation and environmental plans — and the biggest wildcard of all, his personal vendettas — could do to the company. 

What Microsoft can expect from antitrust lawsuits

Trump believes that the less regulation on big business, the better. So you would expect him to put an end to antitrust suits against the tech industry. But that’s not necessarily the case.

There’s no doubt that Lina Khan, the head of the US Federal Trade Commission (FTC) who has aggressively pursued antitrust prosecutions against tech, will be replaced once Trump takes office. And many of Trump’s advisers, notably venture capitalist Marc Andreessen, would like to see tech antitrust prosecutions stop.

However, some advisers close to Trump, including Vice President-elect JD Vance, want the administration to take on Big Tech — mainly because they want to stop Meta and other social media companies from policing misinformation, white supremacism, public-health deceptions, and election lies.

Microsoft has largely been spared Khan’s prosecutions, even as the Biden administration has targeted Google, Apple, Meta, and Amazon. The one recent federal antitrust action against Microsoft by the FTC, over its purchase of the gaming giant Activision, didn’t go well for the feds. A judge let the purchase go through, although the FTC has since appealed the case.

That might make you think that Microsoft is in the clear under Trump. But The Washington Post reports the FTC will be investigating Microsoft’s cloud business for anticompetitive practices. In addition, the FTC appeal of the Activision case still stands, so that case could be revived.

Trump could demand that whomever he appoints to head the FTC drop those actions. Odds are, he won’t, thanks to his main tech adviser, entrepreneur Elon Musk. Musk’s AI startup, xAI, competes directly with Microsoft and is now valued at $50 billion after investments this spring from Andreessen and others. Musk also recently amended an antitrust suit he filed against OpenAI, adding Microsoft as a defendant.

Don’t be surprised if the FTC under Trump not only follows through on Khan’s investigations of Microsoft, but also files an AI suit against the company, thanks to Musk’s influence.

Trump, Microsoft, and climate change

Trump believes climate change is a hoax. He’s vowed to tear up environmental regulations and attack green energy. His campaign slogan, “Drill, Baby, Drill,” and his close friendship with the oil industry make clear that he’ll do everything he can to increase reliance on fossil fuels and kill clean sources of electricity.  

He was also a booster of nuclear power during his first administration, though he wasn’t quite as enthusiastic about it on the campaign trail. Even so, the stock market price of nuclear-power-related companies jumped the day after his election, and most people expect him to be a nuclear backer.

What does this have to do with Microsoft? Plenty. Microsoft has vowed to make itself carbon-negative by 2030, and Trump’s attack on green energy will make it more difficult for the company to find clean energy sources.

Exacerbating Microsoft’s climate-change challenges is the fact that data centers that power AI require a tremendous amount of electricity. As I’ve noted before, Microsoft might be abandoning its promises to fight climate change because of that. And the company could also pour billions into reviving nuclear energy with a proposed deal to reopen Three Mile Island, the site of the worst nuclear power disaster in US history. 

Given Trump’s views about climate change and his support for AI, he’ll most likely do everything he can to give Microsoft and other AI companies all the electricity they want, no matter the effect on the environment. And he’ll also likely let them go full speed ahead with nuclear power. In fact, Microsoft President Brad Smith recently said he expects Trump to cut environmental regulations to provide Microsoft with all the electricity it wants for its AI data centers.

Gregory Allen, director of the Wadhwani AI Center at the Center for Strategic and International Studies — he worked on AI issues at the Department of Defense during the Trump and Biden presidencies — agrees. On a call hosted by The Information, he said Trump “can invoke emergency powers and waive a lot of environmental regulations to allow people to build new nuclear and other electrical generation capacity in order to power the big data centers that folks want for these advanced AI models.”

He added that he expects that to happen “pretty early in the Trump Administration.”

Trump’s vendettas and grievances

The president-elect is driven by vendettas and grievances more than he is by policy. And when it comes to tech, he has plenty of them.

In the 2020 election, Meta founder Mark Zuckerberg and his wife started a foundation “to ensure that everyone can vote and every vote can be counted.” Since then, Trump has threatened to investigate him and send him to jail if re-elected, saying, “We are watching him closely, and if he does anything illegal this time, he will spend the rest of his life in prison.”

Zuckerberg got the message, offering accolades, saying after last summer’s assassination attempt, “Seeing Donald Trump get up after getting shot in the face and pump his fist in the air with the American flag is one of the most badass things I’ve ever seen in my life…. On some level as an American, it’s like hard to not get kind of emotional about that spirit and that fight, and I think that that’s why a lot of people like the guy.”

Then there’s Amazon founder and Washington Post owner Jeff Bezos. When Trump was president, he frequently took aim at Amazon and Bezos because the Post published articles that angered him. Trump didn’t just criticize and threaten Bezos; he also yanked a multi-billion-dollar cloud contract with the Defense Department away from Amazon.

This time around, Bezos is doing Trump’s bidding. He canceled the Post’s planned endorsement of Vice President Kamala Harris even though the newspaper has endorsed candidates for president for decades. After Trump was elected, Bezos praised him, writing on X, “Big congratulations to our 45th and now 47th President on an extraordinary political comeback and decisive victory.”

Those are just two of the tech titans who have praised Trump even though he had targeted them. Microsoft CEO Satya Nadella has so far managed to avoid getting on Trump’s bad side. He hasn’t gone out of his way to praise the president-elect, either, offering Trump only a pro forma congratulation after the election.

But with Musk as a Trump adviser, and what will likely be a big focus on AI in the new administration, it’s not clear whether Nadella will be able to stay out of Trump’s crosshairs. What’s also not clear is how Nadella will react if Trump threatens him — and how that might affect Microsoft’s financial future and its sense of itself as a moral company.

The M4 Pro Mac mini is a ‘triumph’

The “robust computer that’s very, very tiny” — introduced by Apple CEO Steve Jobs almost 20 years ago — just got even tinier. And once again, if you’re thinking of switching from Windows, there’s little excuse not to climb aboard; the “most affordable Mac ever” is also among the fastest consumer AI desktops money can buy.

While the new Mac mini is considerably smaller, its cost increased just a little and computational performance improved dramatically. These impressive changes allow it to be a gateway for switchers, a second computer for any mobile Mac user, and a highly capable desktop for everyone else.

It’s also a server, a computer to which you can offload big tasks, and it’s quite capable of handling the kind of cutting-edge productivity software you might use on a MacBook Pro, though perhaps not as efficiently.

In the interest of full disclosure, I should say up front that I love the new Mac mini. It’s a triumph, a culmination of everything the first Mac mini aimed to be, but much, much better. Introduced along with the also-superb MacBook Pro, Apple’s Mac line-up proves that, with Apple Silicon inside, the company is at the top of its game.

What you can expect under the hood

All this capability comes from the amazing M-series processor Apple has slotted inside, and it reflects the device’s extensive processor history, one that straddles the PowerPC chips of its first release, the Intel years, and today’s super-efficient, low-power chips that put Apple ahead of the industry. There’s a lot to love, starting at $599 (though the M4 Pro model I tested, with a 14-core CPU, 20-core GPU, 48GB of memory, and a 1TB SSD, costs a lot more: $2,199). That price tag might dent the superlatives a little, but probably not fatally.

For a company made famous by the quality of its design, the Mac mini you see today isn’t a major departure from the models of yesteryear, other than size. This third major redesign remains faithful to the breed — a compact metal box designed to work with the mouse, keyboard, and display you already own. Now just 2-in. high, the 5-in.-by-5-in. box (made of 100% carbon-neutral aluminum) remains, resolutely, a Mac mini.

Such is the classic simplicity of Apple design that if you’d been abducted by aliens two decades ago and taken to the peaceful planet Zog to hang out with and learn from an enlightened species, you’d still recognize this as a Mac mini when you returned. (Though you’d probably be disappointed at the state of enlightenment here on Terra Firma.)

But alien adventures aside, because it aims to work with kit you already own, connectivity has always been important to the mini. The new model offers two USB-C ports, HDMI, Gigabit Ethernet, three Thunderbolt 5 ports, a headphone jack, Wi-Fi 6E, and Bluetooth 5.3 — though you no longer get USB-A, putting that standard even further back in history. You also don’t get an SD card slot, but you didn’t in the last model, either.  You can now drive up to three external displays, which is amazing, really, and I bet many of us take that for granted.

The power button (which you rarely, if ever, need to touch) is on the lower left corner of the 1.6-pound device; that positioning raised plenty of critical catcalls when it was spotted, but if that’s all the critics have, then Apple has gotten something right.

What it does

Apple says the Mac mini with M4 Pro is up to 20x faster than the fastest Intel-based Mac mini. The benchmark results I got back up that assertion, and then some. I was a little open-jawed at the numbers and had to run the tests multiple times, so much did they impress me.

Time for some benchmarks:

Geekbench 6.3

  • Single Core: 3,871
  • Multi Core: 22,314
  • OpenCL: 69,013

The CPU results are incredibly impressive. If you check the Geekbench Mac charts, you will find they mean the Mac mini delivers at least as much punch as the currently available Mac Studio, or last year’s 16-in. M3 Max MacBook Pro. There is no performance compromise whatsoever in this machine.

Cinebench R23

  • 22,737 CPU multi core (a top-three position, up there with the Intel Xeon W and AMD Ryzen Threadripper 2990WX).
  • 2,137 CPU single core (leader of the pack).

Valley

FPS 101.3 

It is important to note that Valley isn’t optimized for Apple Silicon and relies on Apple’s Rosetta technology, so it’s not a fair comparative test. But it does illustrate just how performant these little Macs have become.

You’ll find additional benchmark tests at MacStadium, where new M4 Mac minis are already being put into service as servers in real-life, mission-critical environments. MacStadium notes that the M4 Pro “tears past all the previously available Mac mini models, and even puts some of the older Studio models to shame.”

You’ll find a similarly fabulous statement from an impressed Jeff Geerling, who says: “The chip isn’t the fastest at everything, but it’s certainly the most efficient CPU I’ve ever tested. And that scales down to idle power, too — it hovers between 3-4W at idle — which is about the same as a Raspberry Pi.” 

It is worth noting that, thanks to that power efficiency, the Mac mini will barely feel warm to the touch most of the time, no matter how hard you push it. These results, and those of all the other M4-powered Macs, illustrate the extent to which the shift to Apple Silicon has turned the processor industry upside down, putting once last-place Apple within striking distance of the throne.

Take it anywhere

The Mac mini is small. You can put it anywhere you need it — on a bookshelf, under a reception desk, anywhere in an office, and in almost any situation where you might need a computer on a warehouse or factory floor. The front-mounted USB-C ports and headphone jack make its usage flexible, too. While it is not, nor is it intended to be, a portable device, it is worth noting that, so long as you have a keyboard, mouse, and display wherever you intend to go, the Mac mini is a computer you can take with you.

What about Thunderbolt 5?

Apple celebrated the introduction of Thunderbolt 5 on these Macs when they were announced. All the same, for most users it means very little. Sure, if you use a compatible Thunderbolt 5 cable and a compatible device, you’ll get data transfer speeds of up to 120Gbps, but right now those who have those things skew toward being pro gamers and video professionals. That will change, of course, as Thunderbolt 5 proliferates and becomes cheaper, though it is nice to know that you can use this tiny Mac to power multiple 6K displays.
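
For a rough sense of what that headline number means in practice, here is a quick back-of-envelope sketch (my own illustration, not an Apple figure); real-world throughput will be lower once protocol overhead and drive speeds enter the picture.

```python
def transfer_seconds(file_gb: float, link_gbps: float) -> float:
    """Theoretical best-case transfer time: gigabytes -> gigabits, divided by link speed."""
    return (file_gb * 8) / link_gbps

# A 100GB video project over Thunderbolt 5 (120Gbps) vs. Thunderbolt 3/4 (40Gbps)
print(round(transfer_seconds(100, 120.0), 1))  # ~6.7 seconds, in theory
print(round(transfer_seconds(100, 40.0), 1))   # ~20.0 seconds, in theory
```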

Thunderbolt 5 will also be important to those who choose to use the new macOS feature that lets them use larger Mac apps that are stored on external SSD drives.

Time to upgrade?

The new model stacks up proudly against Apple’s first M1-series Mac mini. You’ll see significant performance gains; while the M1 Mac mini I’ve used as my daily driver ever since it was introduced has never let me down, the difference in performance was immediately perceptible.

Four years later, is it time to upgrade? I think it might be, and the fact I’ve had four trouble-free years with an M1 gives me a lot of confidence to expect more great years with an M4 model.

However, in contrast to the Intel Macs, whether or not to upgrade shouldn’t be a question at all — of course you should. The difference in performance was like night and day when the M1 models first appeared; with the M4 series, you’ll feel like you just swallowed a glass of iced water in hell, as someone once said.

Unlike the performance compromise the Mac mini represented back in the day, with Apple Silicon you can look forward to pro performance at a price that’s more within reach.

A dream realized

The thing about the price is important. It’s hard to ignore a computer that starts at $599 and can kick out this level of performance. As a desktop, it ticks most boxes:

  • Windows switchers will like that they might be able to continue using existing kit with the system, and they’ll like it even more once they realize these Macs are so powerful they’ll run Windows in a VM better than some PCs do.
  • Pro users will quickly find these Macs are capable of pro level performance that matches or exceeds some of last year’s more expensive Mac models.
  • Enterprises can be confident that these machines can be deployed across a wide array of situations and handle their tasks really well.
  • And every Mac mini user will appreciate that there is enough processor “oomph” inside these devices that we will still be enjoying a great experience using them in three, four, five or more years’ time. As mentioned above, my M1 Mac mini has never missed its stride and is four years old.

With its new, and still unmistakably Apple, design, the Mac mini looks good, is whisper quiet, runs almost every application you might want to run, and demands hardly any desk space. If you need an Apple desktop, or need to put an Apple system together at as low a price as possible, the great thing about these Macs is you won’t feel at all compromised: these things shift!

All in all, this is a triumph, the culmination of the journey Apple set out on when the first Mac mini models appeared. I can’t recommend it enough.

You can follow me on social media! Join me on BlueSky, LinkedIn, Mastodon, and MeWe.

Can you read your manager’s emails via Copilot?

Microsoft has released a new collection of tools and a guide to fix security issues that have arisen around the way the company’s generative AI (genAI) Copilot assistant handles information. Namely, the tool’s indexing of an organization’s internal data can lead to the AI assistant sharing sensitive information when it shouldn’t.

A Microsoft employee familiar with customer complaints tells Business Insider: “Now, when Joe Blow logs into an account and starts Copilot, they can see everything. All of a sudden, Joe Blow can see the CEO’s email.”

Business Insider reports that the behavior prompted several organizations to delay using Copilot for security reasons. “Many data governance challenges associated with AI were not caused by AI’s arrival,” a Microsoft spokesperson told the publication.

Instead, according to the spokesperson, AI tools like Copilot highlight how companies need to take proactive responsibility for how they manage internal documents and other information.

Windows Recall is now available to test in limited preview

After a series of delays, Microsoft’s Windows Recall feature is now available in a limited preview for Copilot+ PCs. 

Recall, which takes regular snapshots of a user’s screen to provide a searchable timeline of actions, drew criticism from security and privacy experts when it was unveiled earlier this year. 

On Friday, Microsoft announced a new Windows 11 build for the Windows Insider Dev Channel that includes Recall and Click To Do, another Copilot+ feature in preview that provides an interactive overlay on a user’s screen.

The number of Windows Insiders testing the features is likely to be small at this stage, as Windows 11 Insider Preview Build 26120.2415 (KB5046723) can only be accessed on Copilot+ PCs with a Qualcomm Snapdragon chip. Support for devices running AMD and Intel chips is “coming soon,” Microsoft said in a blog post.

 The announcement marks the next step toward a full release for a feature that was labeled a “privacy nightmare” upon its announcement in May. 

A planned rollout in June was postponed in response, and Microsoft has attempted to address security and privacy concerns with several updates. These include making the feature “opt-in,” requiring biometric authentication with Windows Hello prior to use, filtering personal details such as credit card numbers and passwords out of snapshots, and the addition of a “virtualization-based security enclave” (VBS Enclave) to secure data on a user’s device.

Intel’s CHIPS Act grant reduced as production delays and losses mount

The US government has scaled back Intel’s preliminary CHIPS Act grant from $8.5 billion to under $8 billion, reflecting concerns over the company’s delayed investments and financial woes, The New York Times reported. The funding was part of the government’s effort to boost domestic semiconductor manufacturing amid growing global competition.

Intel, originally seen as the largest beneficiary of the CHIPS Act, has struggled to meet expectations following the biggest quarterly loss in its 56-year history. The cut coincides with a $3 billion military contract offered to Intel to produce chips for the Department of Defense, the report said, citing sources who did not wish to be identified.

In March 2024, the Biden administration and Intel signed a preliminary memorandum of terms (PMT) for an $8.5 billion funding package. This support was part of Intel’s broader plan to invest over $100 billion in expanding its US manufacturing operations, including the construction of new chip facilities in Arizona, Ohio, Oregon, and New Mexico.

The agreement also included up to $11 billion in additional loans from the US government, aimed at strengthening Intel’s position as a key player in the evolving AI-driven semiconductor landscape.

The decision to reduce the grant underscores the challenges Intel faces as it attempts to reclaim technological leadership while fulfilling the US administration’s vision of revitalizing domestic chip manufacturing.

However, there is no clarity on the other terms and conditions of the reduced grant package.

Investment delays and strategic setbacks

The funding reduction comes as Intel pushes back the timeline for completing its Ohio chip manufacturing project from 2025 to the end of the decade. The delay, coupled with persistent challenges in matching the technological advancements of rivals like Taiwan Semiconductor Manufacturing Company (TSMC), has dampened confidence in the company’s ability to deliver on its commitments.

“The delay in Intel’s investment is especially concerning given the current surge in demand for chips, driven by the rise of AI,”  said Rachita Rao, senior analyst at Everest Group. “With AI transforming the industry, the existing IT infrastructure is becoming insufficient to handle its processing requirements.”

Intel’s difficulties come as the Biden administration seeks to reduce US reliance on Asian supply chains through the CHIPS Act, a $39 billion initiative aimed at boosting domestic chip production. In March, President Joe Biden highlighted Intel’s role in transforming the semiconductor industry during a high-profile visit to Arizona.

However, Intel’s setbacks now present significant hurdles to achieving that vision, the report noted.

Oversight and milestones

Commerce Department officials, tasked with ensuring accountability for CHIPS Act funding, have set stringent performance benchmarks, such as building plants, producing chips, and securing customer commitments for domestically made products.

Intel’s struggles to meet these goals reportedly complicated its negotiations for the final grant terms, according to the report.

Meanwhile, TSMC has secured a $6.6 billion grant under the program while committing over $65 billion of its own funds to US factory construction.

“Additionally, Intel is pursuing riskier strategies at a time when TSMC is focusing on a low-risk, high-production model that appears to be yielding strong results,” Rao noted. “Given Intel’s inability to effectively compete in the current market, the reduction in funding seems justified to some extent.”

This is certainly not good news for Intel, which is grappling with significant financial challenges. The company reported an 85% year-on-year decline in profits and recently announced plans to cut 15,000 jobs. Additionally, the financial downturn has prompted Intel to suspend dividend payments.

The path ahead for US chip manufacturing

The Biden administration viewed the funding as a strategic initiative to lessen reliance on foreign semiconductor supply chains. The US has highlighted the program’s success in driving factory construction, pointing out that the country will soon host facilities from all five of the world’s leading chipmakers.

“Intel is struggling to keep pace with its competitors, particularly TSMC, which dominates the market with its competitive pricing and significant market share,” Rao said.

Intel’s success is vital not just for the company, but for the broader US semiconductor ecosystem. As AI is poised to drive future demand for advanced chips, Intel’s manufacturing capabilities and technological innovations will be crucial in ensuring the US remains competitive in the global market.

However, the reduction in Intel’s grant underscores the challenges of balancing federal investments with corporate accountability. A query to Intel remains unanswered.

Just what the heck does an ‘AI PC’ do?

Virtually every PC manufacturer has announced, or is already producing, machines with embedded artificial intelligence (AI) functionality. The question is: Why?

Generative AI (genAI) for consumer use already exists through any number of cloud-based services, from OpenAI’s ChatGPT to Google’s Gemini and others.

Even so, next year will be “the year of the AI PC,” according to Forrester Research.

The research firm defines an AI PC as one that has an embedded AI processor and algorithms specifically designed to improve the experience of AI workloads across the central processing unit (CPU), graphics processing unit (GPU), and neural processing unit, or NPU. (NPUs allow the PCs to run AI algorithms at lightning-fast speeds by offloading specific functions.)

“While employees have run AI on client operating systems (OS) for years — think background blur or noise cancellation — most AI processing still happens within cloud services such as Microsoft Teams,” Forrester explained in a report. “AI PCs are now disrupting the cloud-only AI model to bring that processing to local devices running any OS.”
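
To make the idea of “bringing processing to local devices” concrete, here is a minimal sketch of what NPU offloading can look like in practice, assuming ONNX Runtime with a vendor NPU backend such as Qualcomm’s QNNExecutionProvider; the model file name is a placeholder, and the runtime quietly falls back to the CPU if no NPU provider is available.

```python
import numpy as np
import onnxruntime as ort

# Ask the runtime to schedule work on the NPU first, falling back to the CPU.
# QNNExecutionProvider targets Qualcomm NPUs (e.g., Snapdragon Copilot+ PCs);
# other chip vendors ship their own execution providers.
session = ort.InferenceSession(
    "image_classifier.onnx",  # placeholder model file
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],
)

input_name = session.get_inputs()[0].name
frame = np.random.rand(1, 3, 224, 224).astype(np.float32)  # stand-in for a camera frame

# The same call works whether the NPU or the CPU ends up doing the math;
# that transparency is the point of the "AI PC" pitch.
outputs = session.run(None, {input_name: frame})
print(outputs[0].shape)
```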

AMD, Dell, HP, Intel, Apple, Nvidia, and Lenovo have all been touting AI PC innovations to come over the next year or so. Those announcements come during a crucial timeframe for Windows users: Windows 10 will hit its support end of life next October, giving them a real reason to upgrade to Windows 11 — and buy new hardware.

Gartner’s latest worldwide AI PC shipment forecast projects a total of 114 million units in 2025, an increase of 165.5% from this year. Key findings in the forecast include:

  • AI PCs will represent 43% of all PC shipments by 2025, up from just 17% in 2024. 
  • The demand for AI laptops is projected to be higher than that of AI desktops, with shipments of AI laptops accounting for 51% of all laptops in 2025.
  • By 2026, AI laptops will be the only choice of laptop available to large businesses, up from less than 5% in 2023.

“The debate has moved from speculating which PCs might include AI functionality, to the expectation that most PCs will eventually integrate AI NPU capabilities,” said Ranjit Atwal, senior director analyst at Gartner. “As a result, NPU will become a standard feature for PC vendors.”

As the PC market moves to AI PCs, x86 processor dominance will lessen over time, especially in the consumer AI laptop market, as Arm-based AI devices grab more share from Windows x86 AI and non-AI laptops, according to Atwal. “However, in 2025, Windows x86-based AI laptops will lead the business segment,” Atwal said.

But why bother embedding AI algorithms in a computer’s firmware or software — thus, requiring more expensive processors to power them — when you can access those same tools on the web? According to Tom Butler, Lenovo’s executive director of worldwide commercial product management, AI will fundamentally transform PCs, making them not only smarter but also more responsive and secure.

“We see AI-enabled PCs evolving to provide more personalized, adaptive experiences that are tailored to each user’s needs,” Butler said. “The rise of generative AI was a pivotal moment, yet reliance on cloud processing raises concerns around data privacy.”

Each component of a PC plays a unique role in making AI tasks efficient, but the NPU is key for accelerating AI computations with minimal power consumption, according to Butler. In general, he said, AI PCs assist in or handle routine tasks to be more efficient and intuitive for users without the need to access an external website or service.

Apple, for example, last month announced an updated iMac powered by its new M4 chip with an NPU core and Apple Intelligence, an AI-powered assistant that can help users write emails or other content. (More intensive or complex tasks can still be handed off to OpenAI’s ChatGPT.) Apple also unveiled M4-powered MacBook Pro laptops and Mac minis — all while touting their strength in handling AI-related tasks.

AI PCs can also boost productivity by handling routine tasks such as scheduling and organizing emails, and by enhancing collaboration with real-time translation and transcription features, according to Butler.

[Chart: “Stuff AI does on PCs” (source: Intel Corp.)]

Depending on the device, AI technology can also seamlessly integrate with cloud and edge computing for real-time data processing, enabling faster and more informed decision-making. AI-enabled PCs also increase device security by automating threat detection and adapting to new threats as they arise.

For example, Butler said, Lenovo’s Smart Connect enhances device synergy, allowing users to transition seamlessly between Lenovo devices, while ThinkShield provides security across the ecosystem, protecting users in real time.

AI-powered PCs, however, generally require more RAM to handle advanced tasks. Apple, for example, is moving from a minimum of 8GB of RAM to 16GB.

Lenovo took a slightly different approach in its RAM support of AI tasks. The company’s “Smarter AI for All” tries to match the complexity of tasks to processing needs. For example, 16GB is suitable for lighter AI tasks when combined with a more powerful NPU, while 32GB or more is suited for users handling complex applications, large language models, or deep learning.

“Users working within AI development spheres will most likely require more RAM, combined with powerful GPU and CPU to ensure low latency and AI model fine-tuning capabilities,” Butler said.
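
Butler’s 16GB-versus-32GB guidance maps onto some simple arithmetic about model weights. The sketch below is my own back-of-envelope estimate, not Lenovo’s sizing methodology; the overhead factor for activations and runtime buffers is an assumption.

```python
def estimated_llm_ram_gb(params_billions: float, bytes_per_param: float,
                         overhead: float = 1.2) -> float:
    """Rough RAM needed to hold a language model's weights locally.

    overhead is a fudge factor for activations, KV cache, and runtime buffers.
    """
    return params_billions * 1e9 * bytes_per_param * overhead / (1024 ** 3)

# A 7B-parameter model quantized to 4 bits (0.5 bytes/param) fits easily in 16GB;
# the same model at 16-bit precision (2 bytes/param) starts to crowd a 16GB machine.
print(round(estimated_llm_ram_gb(7, 0.5), 1))  # ~3.9 GB
print(round(estimated_llm_ram_gb(7, 2.0), 1))  # ~15.6 GB
```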

Could AI make things harder?

Ironically, though, a new survey and study conducted by Intel found that current AI PC owners spend more time on tasks than people who use PCs without AI technology. The survey of 6,000 consumers from Germany, the UK, and France found that about 53% believe AI-enabled PCs are only good for “creatives or technical professionals.” And 44% see AI PCs mainly as “a gimmick or futuristic technology.”

In all, the survey showed that, in general, users spend 899 minutes cumulatively, nearly 15 hours, on computer-related chores weekly. Intel’s study showed that current AI PC owners spend longer on tasks than their counterparts using traditional PCs because many spend “a long time identifying how best to communicate with AI tools to get the desired answers or response.”

“Organizations providing AI-assisted products must offer greater education in order to truly showcase the potential of ‘everyday AI,'” Intel argued.

[Chart: “What saps time on a PC?” (source: Intel Corp.)]

When its uses are understood, leveraging AI tools to handle repetitive tasks, streamline workflows, or even assist in research can greatly boost productivity, according to a 2023 study by AI safety and research company Anthropic.

While only 32% of respondents who aren’t familiar with AI PCs would consider purchasing one for their next upgrade, that percentage jumps significantly to 64% among respondents who have used one before.  The survey and study also stated that “early data” suggests AI-enabled PCs can also save users about 240 minutes a week on routine tasks.

The problem is that many AI PC owners simply aren’t aware of the benefits of AI or don’t know how to access the tools, Anthropic argued. “Despite AI PCs becoming more available to people, 86% of respondents have either never heard of or used an AI PC. Meanwhile, those respondents who already own an AI PC are actually spending longer on digital chores than those using a traditional PC.”

The study concluded that “greater consumer education is needed to bridge the gap between the promise and reality of AI PCs.”

For business-to-business (B2B) purposes, AI PCs offer a promising solution, according to Mike Crosby, executive director at industry advisory service Circana.

Just three of the 20 US business sectors defined by the federal government, including professional and scientific, finance, and health care, represent nearly 50% of total AI PC unit sales, Crosby said. “Companies are evaluating these new technologies carefully, weighing the benefits of innovation against the risks to their established environments.”

The upcoming October 2025 sunset of Windows 10 support further amplifies the urgency for AI PCs, with nearly 60-70% of the installed base still on older versions. Microsoft’s extended security update (ESU) offers a temporary reprieve, but Circana expects modernization to ramp up quickly as the deadline approaches.

AWS and Anthropic ink deal to accelerate model development, enhance AI chips

The announcement that Amazon Web Services (AWS) will be Anthropic’s primary training partner confirms rumors of an even tighter partnership between the two companies.

They announced Friday that Anthropic will use AWS Trainium processors to train and deploy its Claude family of models. Further, as predicted earlier this month, Amazon will invest an additional $4 billion in the startup, making its total investment $8 billion.

AWS is already Anthropic’s primary cloud provider, and the OpenAI rival will now also primarily use Trainium and Inferentia chips to train and deploy its foundation models. Anthropic will also contribute to Trainium development in what the companies call a “hardware-software development approach.”

While it’s unclear whether the agreement requires Anthropic to exclusively use AWS chips, it is a move by Amazon to challenge the likes of Nvidia and other dominant players as the AI chip race accelerates.

“This is a first step in broadening the accessibility of generative AI and AI models,” Alvin Nguyen, Forrester senior analyst, told Computerworld.

Accelerating Claude development

Anthropic, which launched in 2021, has made significant progress with its Claude large language models (LLMs) this year as it takes on OpenAI. Its Claude 3 family comprises three LLMs: Sonnet, Haiku (its fastest and most compact), and Opus (for more complex tasks), which are all available on Amazon Bedrock. The models have vision capabilities and a 200,000 token context window, meaning they support large volumes of data, equal to roughly 150,000 words, or 500 pages of material.
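
Those word and page figures follow from common rule-of-thumb conversions rather than anything exact; the sketch below uses assumed ratios of roughly 0.75 words per token and 300 words per page.

```python
def context_window_capacity(tokens: int, words_per_token: float = 0.75,
                            words_per_page: int = 300) -> tuple[float, float]:
    """Convert a token budget into approximate words and pages (rule-of-thumb ratios)."""
    words = tokens * words_per_token
    return words, words / words_per_page

words, pages = context_window_capacity(200_000)
print(f"~{words:,.0f} words, ~{pages:,.0f} pages")  # ~150,000 words, ~500 pages
```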

Notably, last month Anthropic introduced “Computer Use” to Claude 3.5 Sonnet. This capability allows the model to use computers as people do; it can quickly move cursors, toggle between tabs, navigate websites, click buttons, type, and compile research documents in addition to its generative capabilities. All told, the company claims that Sonnet outperforms all other available models on agentic coding tasks.

Claude has experienced rapid adoption since its addition to Amazon Bedrock, AWS’ fully-managed service for building generative AI models, in April 2023, and now supports “tens of thousands” of companies across numerous industries, according to AWS. The foundation models are used to build a number of functions, including chatbots, coding assistants, and complex business processes.
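
For a sense of how those companies call Claude through Bedrock, here is a minimal sketch using the AWS SDK for Python; the region and model ID are examples only and depend on what is enabled in a given account.

```python
import json
import boto3

# Region and model ID are illustrative; check which Claude models your account has enabled.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

request = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Summarize this week's support tickets in three bullets."}
    ],
}

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example Claude model ID
    body=json.dumps(request),
)

reply = json.loads(response["body"].read())
print(reply["content"][0]["text"])  # the model's answer
```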

“This has been a year of breakout growth for Claude, and our collaboration with Amazon has been instrumental in bringing Claude’s capabilities to millions of end users on Amazon Bedrock,” Dario Amodei, co-founder and CEO of Anthropic, said in an announcement.

The expanded partnership between the two companies is a strategic one for both sides, signaling that Anthropic’s models are performant and versatile, and that AWS’ infrastructure can handle intense generative AI workloads in a way that rivals Nvidia and other chip players.

From an Anthropic point of view, the benefit is “guaranteed infrastructure, the ability to keep expanding models’ capabilities, and showcase them,” said Nguyen, noting that it also expands their footprint and access.

“It’s showing that they can work well with multiple others,” he said. “That increases comfort levels in their ability to get training done, to produce models, to get them utilized.”

AWS, meanwhile, has a “premier client, one of the faces of AI” in Anthropic, said Nguyen.

From silicon through the full stack

As part of the expanded partnership, Anthropic will also help to develop and optimize future versions of AWS’s purpose-built Trainium chip. The machine learning (ML) chip supports deep learning training for 100 billion-plus parameter models.

Anthropic said it is working closely with AWS’ Annapurna Labs to write low-level kernels that allow it to interact with Trainium silicon. It is also contributing to the AWS Neuron software stack to help strengthen Trainium, and is collaborating with the chip design team around hardware computational efficiency.

“This close hardware-software development approach, combined with the strong price-performance and massive scalability of Trainium platforms, enables us to optimize every aspect of model training from the silicon up through the full stack,” Anthropic wrote in a blog post published Friday.
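
Anthropic’s kernel work happens well below the level most developers ever see, but the Neuron software stack mentioned above is the same one AWS customers use. As a hedged illustration only, here is roughly what compiling an ordinary PyTorch model for Trainium or Inferentia hardware looks like with the torch-neuronx package; it stands in for, and is far simpler than, the low-level work Anthropic describes.

```python
import torch
import torch_neuronx  # PyTorch integration from the AWS Neuron SDK

# A toy network standing in for a real workload.
model = torch.nn.Sequential(
    torch.nn.Linear(512, 1024),
    torch.nn.GELU(),
    torch.nn.Linear(1024, 512),
).eval()

example_input = torch.rand(8, 512)

# Compile the model for Neuron devices; running the traced artifact
# requires a Neuron-equipped instance (e.g., Trn1 or Inf2).
neuron_model = torch_neuronx.trace(model, example_input)
print(neuron_model(example_input).shape)  # torch.Size([8, 512])
```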

This approach provides an advantage over more general-purpose hardware (such as Nvidia’s GPUs) that does more than what is “absolutely necessary,” Nguyen pointed out. The companies’ long partnership also means they may have mitigated the performance optimization advantages Nvidia enjoys with its CUDA platform.

“This type of deep collaboration between the software and hardware engineers/developers allows for optimizations in both the hardware and software that is not always possible to find when working independently,” said Nguyen.

OpenAI is thinking about building its own browser

OpenAI is reportedly thinking about developing its own browser with the aim of challenging Google’s dominance in the market, according to The Information. The new browser would have built-in support for ChatGPT and OpenAI’s search engine, SearchGPT.

OpenAI representatives have apparently held talks with developers from Conde Nast, Redfin, Eventbrite, and Priceline, but so far no agreements have been signed.

Shares of Google’s parent company Alphabet declined on the Nasdaq exchange after the browser plans became public, Reuters reported.