Microsoft Ignite 2024 kicks off in Chicago and runs Nov. 19-22. If you can’t make it to Chicago, no worries. First, the physical event is sold out, according to the Ignite event page. Second, it’s a hybrid event, so you can attend Ignite virtually.
Whether you’re there physically or online, expect to learn more about the latest technologies from Microsoft — everything from artificial intelligence (AI) to cloud computing, security, productivity tools, and more. In the keynote address, Microsoft CEO Satya Nadella and Microsoft leaders — including Charlie Bell, executive vice president of Microsoft Security, and Scott Guthrie, executive vice president of the Microsoft Cloud + AI Group — will share how the company is creating new opportunities across its platforms in this rapidly evolving era of AI.
You can also network with industry experts and Microsoft’s team, IT leaders, and other tech enthusiasts; gain hands-on experience and learn from experts at technical sessions; and learn about new products and services. (Microsoft often announces new products and features at Ignite.)
As you get ready for the event to start, here’s a look back at some of our previous Ignite coverage, as well as recent articles that touch on some of the topics you can expect to see at the event. And remember to check this page often for more on Ignite 2024.
Nov. 15, 2023: Microsoft’s 2023 Ignite conference might as well be called AIgnite, with over half of the almost 600 sessions featuring AI in some shape or form. Generative AI (genAI), in particular, is at the heart of many of the product announcements Microsoft is making at the event, including new AI capabilities for wrangling large language models (LLMs) in Azure, new additions to the Copilot range of genAI assistants, new hardware, and a new tool to help developers deploy small language models (SLMs), too.
Microsoft partners with Nvidia, Synopsys for genAI services
Nov. 16, 2023: Microsoft has announced that it is partnering with chipmaker Nvidia and chip-design software provider Synopsys to provide enterprises with foundry services and a new chip-design assistant. The foundry services from Nvidia will be deployed on Microsoft Azure and will combine three of Nvidia’s elements — its foundation models, its NeMo framework, and its DGX Cloud service.
As Microsoft embraces AI, it says sayonara to the metaverse
Feb. 23, 2023: It wasn’t just Mark Zuckerberg who led the metaverse charge by changing Facebook’s name to Meta. Microsoft hyped it as well, notably when CEO Satya Nadella said, “I can’t overstate how much of a breakthrough this is,” in his keynote speech at Microsoft Ignite in 2021. Now, tech companies are much wiser, they tell us. It’s AI that’s at the heart of the coming transformation. The metaverse may be yesterday’s news, but it’s not yet dead.
Microsoft Ignite in the rear-view mirror: What we learned
Oct. 17, 2022: Microsoft treated its big Ignite event as more of a marketing presentation than a full-fledged conference, offering up a variety of announcements that affect Windows users, as well as large enterprises and their networks. (The show was a hybrid affair, with a small in-person option and online access for those unable to travel.)
Related Microsoft coverage
Microsoft’s AI research VP joins OpenAI amid fight for top AI talent
Oct. 15, 2024: Microsoft’s former vice president of genAI research, Sebastien Bubeck, left the company to join OpenAI, the maker of ChatGPT. Bubeck, a 10-year veteran at Microsoft, played a significant role in driving the company’s genAI strategy with a focus on designing more efficient small language models (SLMs) to rival OpenAI’s GPT systems.
Microsoft brings Copilot AI tools to OneDrive
Oct. 9, 2024: Microsoft’s Copilot is now available in OneDrive, part of a wider revamp of the company’s cloud storage platform. Copilot can now summarize one or more files in OneDrive without needing to open them first; compare the content of selected files across different formats (including Word, PowerPoint, and PDFs); and respond to questions about the contents of files via the chat interface.
Microsoft wants Copilot to be your new AI best friend
Oct. 9, 2024: Microsoft’s Copilot AI chatbot underwent a transformation last week, morphing into a simplified pastel-toned experience that encourages you…to just chat. “Hey Chris, how’s the human world today?” That’s what I heard after I fired up the Copilot app on Windows 11 and clicked the microphone button, complete with a calming, wavy background. Yes, this is the type of banter you get with the new Copilot.
The European Commission has opened a formal investigation into whether US glass producer Corning, known for its Gorilla Glass, might have abused its dominant position in the market for protective glass for electronic devices. Corning’s products are used, among other things, in several of Apple’s and Samsung’s devices.
The Commission suspects the company might have entered into anticompetitive agreements with cell phone makers and glass refiners, including exclusive purchasing requirements and discounts tied to those pacts. Gorilla Glass has been used in mobile devices for more than a decade.
The agreements might have prevented competitors from entering the market, reducing consumer choice, raising prices and inhibiting innovation. If Corning is found guilty, the company could be fined. Before that happens, Corning will have the chance to respond to the European Commission’s objections and the investigation can be closed if the company fulfills certain commitments.
Discontinued in 2010, this was an Apple server that saw adoption as a supercomputer cluster, and found another use within movie industry workflows as a RAID system. Fans might be interested to know that an Xserve cluster at Virginia Tech ranked No. 7 on the Top 500 list of supercomputers in 2004, topping out at 12.25 teraflops of performance. (That, incidentally, is about the performance of an iPhone 12, or an M1-based Mac.)
Holding it wrong
Apple discontinued the Xserve with a famously terse Steve Jobs email apparently claiming “hardly anyone was buying it.”
Today, with what are arguably the world’s most performant low-power computer chips rolling off production lines, the Apple Silicon opportunity means the company is returning to the server market; it’s tasking Foxconn with making M4-powered servers to run Apple Intelligence as that service gets rolled out globally over the coming year.
Apple Intelligence servers are currently powered by the M2 Ultra chip, but Apple intends to upgrade these to M4 chips next year. It is alleged that the choice of Taiwan is deliberate, as the company hopes to gain some input from engineers who have worked on Nvidia servers, though as Apple Intelligence is an internal Apple project there’s no conflict of interest in that proposal — at least, not yet.
After all, Apple is not competing in the server market simply by making servers for its own AI, though its M4 Ultra chip might even outperform Nvidia’s mighty RTX 4090 processor, reports claim. So perhaps there’s a pathway there.
Apple now makes servers
Apple uses these servers for Apple Intelligence functions that require more power than the Apple device used to request the task. When those tasks are uploaded to the cloud, they are given to Apple’s own super-private servers or (optionally) outsourced to OpenAI.
To protect the flow of data, the company’s Private Cloud Compute is a server-based Apple Intelligence rig that lets Mac, iPhone, and iPad users exploit Apple’s own AI in the cloud. What’s important about the service is that it maintains the high privacy and security we already expect from Apple. That means Apple won’t get to see or keep your data and will not know what you’ve requested. “Private Cloud Compute allows Apple Intelligence to process complex user requests with groundbreaking privacy,” said Craig Federighi, Apple’s senior vice president of software engineering.
The idea is that you can use these LLM tools with peace of mind — the kind any rational person will require when handling their own information. I’ve argued before that this is what every cloud-based AI service should strive to deliver, though I don’t think they will; too many business models are based around capturing, exploiting, and even selling information about their users. That’s why some companies ban staff from using AI.
Perhaps it could sell or rent these servers?
One thing about Apple Intelligence that perhaps isn’t being fully explained is that Apple also offers developers APIs so they can weave its generative AI technology into their own products. Right now, that means introducing Apple Intelligence features within them, but given the importance of AI to developers, and the desire among some of them to make smart tools that can be used privately for specific use cases, at what point might Apple offer Private Cloud Compute as a service to provide trusted computing? Perhaps that is why it is putting the system through such rigorous security review?
There has to be an opportunity. Some companies will want to build their own AI solutions but demand the kind of hardcore security Private Cloud Compute provides. Given that Apple has tasked Foxconn with making servers to support that service, at what point will provision of the servers, along with the bare-bones, highly secure software they run, become a business opportunity? There’s a business case, and given that Apple is already leading the industry in just how willing it is to open these boxes up for security review, it feels like a potential direction — if there’s any money in it.
And there clearly is — quite a lot, in fact.
As everything becomes AI, where’s the money?
Recognition of the value of, and need for, AI servers is, in part, what has driven Nvidia’s market cap to intermittently overtake that of Apple this year. The need for servers to support AI is a growth opportunity for all in the space — except perhaps for Intel and AMD, which are watching as Arm’s reference designs define expectations for processor performance.
Whether it wants to be or not, Apple is in the server business, and now that it is, it makes sense for the company to generate more revenue from it. After all, who else promises the kind of rock-solid platform-focused security? Who else can provide such fast chips at such low energy requirements? The only snag in this particular ointment is that Apple Intelligence is not inherently cross-platform, though this hasn’t really got in the way of the company’s success for the last couple of decades.
In an internal meeting, Amazon CEO Andy Jassy responded to recent criticism from many employees about the company’s new plan for a full return to the office in January. The mandate means that, as of the beginning of the new year, almost all employees will have to be in the office five days a week.
Jassy said the aim is not to force any resignations among staffers or to satisfy decision-makers in cities, which were among the allegations made by angry employees, Reuters reports.
Employees have also objected that the return-to-work plan is stricter than arrangements at other large tech companies and that it will make work less efficient due to commuting times. Jassy previously said his goal is to increase efficiency at work and promote collaboration and innovation.
AMD has launched its first open-source large language models (LLMs) under the OLMo brand, aiming to strengthen its position in the competitive AI landscape led by giants like Nvidia, Intel, and Qualcomm.
AMD OLMo is a series of 1-billion parameter large language models trained from scratch using trillions of tokens on a cluster of AMD Instinct MI250 GPUs. They are designed to excel in reasoning, instruction-following, and chat while embracing an open-source ethos that allows developers access to data, weights, training recipes, and code.
“Continuing AMD tradition of open-sourcing models and code to help the community advance together, we are excited to release our first series of fully open 1 billion parameter language models, AMD OLMo,” AMD said in a statement.
AMD’s open-source approach positions OLMo as an accessible and scalable option for companies seeking alternatives in AI technology. The model can be deployed in data centers or on AMD Ryzen AI PCs equipped with neural processing units (NPUs), allowing developers to leverage advanced AI directly on personal devices, the statement added.
“AMD is following Nvidia’s lead by expanding into the large language model (LLM) space alongside its well-established strength in computing hardware — a direction that Intel and Qualcomm have not yet fully embraced,” said Abhigyan Malik, practice director at Everest Group. “By fostering an open ecosystem, AMD enables developers to innovate and build diverse applications through a network effect.”
According to Malik, this strategy amplifies AMD’s core value proposition, particularly in driving demand for its underlying hardware, including AMD Instinct MI250 GPUs and Ryzen CPUs, where “AMD seeks to create lasting market impact.”
Extensive training and fine-tuning
The OLMo series follows a detailed three-phase training and fine-tuning process, according to AMD.
Initially, OLMo 1B was pre-trained on a subset of the Dolma v1.7 dataset using a transformer model focused on next-token prediction. This helped the model grasp general language patterns. In the second phase, OLMo 1B underwent supervised fine-tuning (SFT) on multiple datasets to refine its capabilities in science, coding, and mathematics.
The final model, OLMo 1B SFT DPO, was optimized with Direct Preference Optimization (DPO) based on human feedback, resulting in a model that effectively aligns its responses with typical user expectations.
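For developers who want to experiment, fully open models of this kind are typically published as standard checkpoints that load with the Hugging Face transformers library. The sketch below is illustrative only: the model ID and generation settings are assumptions rather than AMD’s documented usage, so substitute whatever checkpoint name AMD actually publishes.

```python
# A minimal, illustrative sketch: load a ~1B-parameter open model with the
# Hugging Face transformers library and generate a short reply.
# NOTE: the model ID below is an assumption for illustration; replace it with
# the checkpoint AMD actually publishes.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "amd/AMD-OLMo-1B-SFT-DPO"  # hypothetical/illustrative ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain the difference between supervised fine-tuning and DPO in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Greedy decoding with a modest token budget; these settings are arbitrary defaults.
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the weights, data, and training recipes are open, the same checkpoint could in principle be fine-tuned further or quantized to run locally, for example on the Ryzen AI PCs AMD mentions.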
Competitive performance and benchmark success
In internal benchmarks, AMD’s OLMo models performed well against similarly sized open-source models, such as TinyLlama-1.1B and OpenELM-1_1B, in multi-task and general reasoning tests, the company claimed. Specifically, its performance increased by over 15% on tasks in GSM8k, a substantial gain attributed to AMD’s multi-phase supervised fine-tuning and Direct Preference Optimization (DPO).
In multi-turn chat tests, AMD claimed, OLMo showed a 3.41% edge in AlpacaEval 2 Win Rate and a 0.97% gain in MT-Bench over its closest open-source competitors.
However, when looking at the broader LLM landscape, Nvidia’s GH200 Grace Hopper Superchip and H100 GPU remain leaders in LLM processing, particularly for large, multi-faceted AI workloads. Nvidia’s focus on innovations like its NVLink-C2C interconnect, which accelerates data transfer between its CPUs and GPUs, gives it an edge, providing a speed advantage for high-demand inference tasks such as recommendation systems.
Intel, while slightly behind in peak speed, leverages its Habana Gaudi2 accelerator for cost-effective yet robust performance, with future upgrades planned for increased precision.
Meanwhile, Qualcomm’s Cloud AI100 emphasizes power efficiency, meeting the needs of organizations seeking high AI performance without the extensive energy demands associated with Nvidia’s high-end systems.
AMD’s OLMo models also showed strong performance on responsible AI benchmarks, such as ToxiGen (for toxic language detection), crows_pairs (bias assessment), and TruthfulQA-mc2 (accuracy). These scores reflect AMD’s commitment to ethical AI, an essential focus as AI integration scales across industries.
AMD’s position in the AI market
With its first open-source LLM series, AMD is positioned to make significant inroads in the AI industry, offering a compelling balance of capability, openness, and versatility to compete in a market currently led by Nvidia, Intel, and Qualcomm.
However, AMD’s ability to close the gap will depend on how well its open-source initiative and hardware enhancements keep pace with rivals’ advances in performance, efficiency, and specialized AI capabilities.
“AMD’s entry into the open-source LLM space strengthens the ecosystem, potentially lowering the operational costs associated with adopting generative AI,” said Suseel Menon, practice director at Everest Group.
AMD’s move into LLMs places it against established players like Nvidia, Intel, and Qualcomm, who have gained market prominence with their proprietary models.
“This move also puts pressure on proprietary LLMs to continually innovate and justify their pricing structures,” Menon added.
Analysts believe AMD’s unique open-source strategy and accessibility aim to attract enterprises and developers looking for flexible, affordable AI solutions without proprietary constraints.
“For large enterprises with long-term data privacy concerns, AMD’s open-source model offers a compelling alternative as they navigate AI integration,” Menon added. “By building a cohesive, full-stack AI offering that spans hardware, LLMs, and ecosystem tools, AMD is positioning itself with a distinct competitive edge among leading silicon vendors.”
Cloud architect, data security engineer, and ethical hacker are among the highest-paying roles that can be attained through IT certifications — and AI technology didn’t even make the list.
Online learning platform Skillsoft analyzed the top reported salaries of IT professionals around the world to find the highest-paying certifications and developed a list of more than 20.
This year’s list shows that cloud computing skills remain in high demand and can be quite lucrative. The AWS Certified Security – Specialty certification jumped from sixth-highest to top-paying this year and now commands a $204,000 annual salary on average — up 22%, or $40,000, over last year.
The presence of certifications for Google Cloud Platform (GCP), AWS, Azure, and Nutanix also highlights the value of a diverse cloud skillset, as organizations adopt multi-cloud or hybrid cloud strategies, according to Skillsoft.
Its list is similar to one published earlier this year by job search platform Indeed, which also placed an AWS certification in the No. 1 slot. (Indeed found AWS Certified Solutions Architects could earn from $133,200 to $246,900 a year at some firms.)
“So, are they worth it? For those looking for any of the above, it’s a resounding yes,” Skillsoft said in a blog post. “But, earning a certification takes time, effort, and often money.”
Are certifications worth the price?
Earning a certification led to pay raises, promotions, and new jobs, according to Skillsoft. In addition to the AWS training, other top-paying certifications included:
Google Cloud – Professional Cloud Architect, averages $190,204.
Nutanix Certified Professional – Multicloud Infrastructure (NCP-MCI) v6.5, averages $175,409.
Gartner Research, in an August report, also found that AWS Certified Cloud Practitioners and Microsoft Certified Azure Fundamentals certifications were top upskilling opportunities for tech workers. Other IT certifications with fast-growing demand this year are in cybersecurity, including the CISSP certification, CISA, and CompTIA Security+, according to Gartner. (The latter — IT certifications from the Computing Technology Industry Association (CompTIA), a non-profit trade association — were also among the general class of top certifications on multiple lists.)
“While learning new technology skills is vital, the ability for employees to demonstrate practical expertise through industry-recognized certifications is increasingly valued,” Gartner said. “Though they may not be a mandatory prerequisite for every position, certifications can empower individuals and organizations alike.”
“Our data suggests that tech professionals skilled in cloud computing, security, data privacy, and risk management, as well as able to handle complex, multi-faceted IT environments, will be well positioned for success,” said Greg Fuller, vice president of online learning platform Codecademy Enterprise. “Overall, the IT job market is characterized by a significant imbalance between supply and demand, which continues to drive salaries higher.”
What’s happening with AI training?
While AI certifications have not yet risen to the top of IT certification lists, the increasing emphasis on data privacy and compliance is closely tied to the rollout of AI technologies. And while AI skills are gaining popularity, it often takes time for certifications to gain traction, Fuller said.
“Right now, what we see with areas like AWS Security at the top is that organizations are still preparing for large scale AI rollouts,” he said. “So more adjacent skills are on this year’s list. Ultimately, it’s a mix of certifications being a bit slower to evolve and adjacent skills rising in criticality.
“In the meantime, the backbone of AI is cloud, so getting cloud certified is a good first step. Then, look at some of the more specialized Cloud AI certifications,” Fuller added.
Recruitment and talent consulting firm WilsonHCG released a report this week indicating that while AI certifications might not be on the top 20 lists, there is rising demand for AI skills across sectors. The market for AI-skilled workers is expanding, too, with 5,898 average monthly job postings in October, according to WilsonHCG.
The October figure reflects a significant increase from the 12-month average of 5,147 monthly postings, driven by heightened interest in roles like data scientist, AI research engineer, and machine learning engineer.
Companies such as TikTok, Apple, Google, Amazon, and Deloitte are among the most active in AI recruitment, underscoring the technology’s growing adoption in sectors from tech to finance and professional services, according to WilsonHCG.
“The need for AI skills extends beyond traditional tech positions. Companies are seeking professionals across a range of roles, including Founding AI Engineer and Senior Software Engineer for AI products,” WilsonHCG said in its report. “This trend is reshaping hiring practices and job titles as more organizations prioritize data-driven and AI-enabled functions across departments.”
Skills continue to matter more than formal education
Skills-based hiring approaches that emphasize strong work backgrounds, certifications, assessments, and endorsements continue to dominate the tech industry. And soft skills are becoming a key focus of hiring managers, even over hard skills.
Elise Smith, co-founder and CEO of Praxis Labs, an AI-based learning platform, said she has worked with enterprises like Google, Uber, and ServiceNow to help senior leaders develop the skillsets needed for “new-age talent retention” and collaboration in the workplace.
“As workplaces continue to transform — whether it’s emerging technologies like genAI transforming how we work or sociopolitical conflicts that cause disruption to our workflows — human skills will become more and more important,” Smith said.
What’s often missing from higher education is a focus on skills building around interpersonal communication, conflict resolution, critical reasoning, and the ability to determine fact from opinion or misinformation. “What once may have been called soft skills will be seen as power skills, and workforces who focus and develop these skills will differentiate in market outcomes,” Smith said.
While building relations and moving beyond “transactional trust” in the workplace can be challenging — especially for a hybrid global workforce — it’s important to build skills around workplace connection.
“When managers are skilled in asking open-ended questions, coaching disengaged team members, learning more about individuals’ backstories and contexts, and encouraging them in their work, teams thrive,” she said. “These are the skillsets we help our clients and their people leaders develop.”
The UK government has introduced an AI assurance platform, offering British businesses a centralized resource for guidance on identifying and managing potential risks associated with AI, as part of efforts to build trust in AI systems.
About 524 companies now make up the UK’s AI sector, supporting more than 12,000 jobs and generating over $1.3 billion in revenue, the UK government said. Official projections estimate the market could grow to $8.4 billion by 2035.
The Mozilla Foundation, the nonprofit organization behind the Firefox open-source browser, said it has laid off about 30% of its employees as part of a reorganization to increase its “agility.”
As of 2023, the foundation had between 80 and 300 employees, according to varying reports. A spokesperson declined to say how many employees the company has now.
Established in 2003, the group is best known for its development of the Firefox web browser, as well as its advocacy for internet privacy, digital rights, and freely available, open-source software.
Mozilla Foundation spokesman Brandon Borrman, vice president of communications, said the nonprofit is reorganizing teams to boost agility and impact as it accelerates efforts toward “a more open and equitable technical future. That unfortunately means ending some of the work we have historically pursued and eliminating associated roles to bring more focus going forward,” he said in a statement to Computerworld.
The non-profit arm is distinct from the Mozilla Corporation, which is the for-profit company responsible for generating revenue through products like the web browser. The corporation employs a much larger number of people, likely 700 or more.
The Mozilla Foundation’s executive director, Nabiha Syed, said in an email last week that two of the foundation’s major divisions — advocacy and global programs — are “no longer a part of our structure,” according to a TechCrunch report.
Contrary to reports, however, Borrman said the restructuring will not impact its goal of open-source and free internet advocacy. “On the contrary, advocacy is still a central tenet of Mozilla Foundation’s work,” he said. “Fighting for a free and open internet will always be core to our mission, and advocacy continues to be a critical tool in that work. We are in the process of revisiting our approach to it.”
Along with the Mozilla Foundation, Mozilla currently consists of five organizations: the Mozilla Corporation, which leads consumer product-based work; Mozilla Ventures, a “tech-for-good” investment fund; Mozilla.ai, an AI R&D lab; and MZLA, which makes Thunderbird.
In 2020, the Mozilla Corporation cut about 25% of its 1,000-person global workforce, saying that the coronavirus pandemic’s impact on economies “significantly impacted our revenue.”
Borrman said the layoffs did not affect any of the other Mozilla entities.
Let’s face it: For many people, web browser performance could well be more important than general PC performance.
Browser makers are wising up to this, too. Google Chrome just introduced new performance controls, while Microsoft Edge has attempted to stand out with its own browser performance options. And every web browser out there has long fought over the title of fastest in the land.
So let’s talk browser performance — and how you can get more of it, specifically when working within Windows. In a world where websites feel like they’re getting heavier and heavier, upgraded browser performance means everything from faster load times and a better all-around browsing experience to more reliable all-around PC performance and longer laptop battery life.
Want more Windows PC tips? My free Windows Intelligence newsletter delivers all the best Windows tips straight to your inbox. Plus, you’ll get free in-depth Windows Field Guides as a special welcome bonus.
Windows web browsing boost #1: Cull your extensions
Does your browser feel inexplicably slow? Before you do anything else, I’d recommend pruning any installed browser extensions. Add-ons can be useful, but they can also add some serious overhead to your browsing. They may always be running in the background, or they may run some code on each web page you load.
In Google Chrome, you can click the main three-dot menu icon > Extensions > Manage Extensions to see a list of what’s installed. From there, you can disable or remove them. Other browsers have a similar menu and mechanism, potentially with slightly different placement and phrasing.
You might want to try disabling a few browser extensions first to see if your browser feels faster. If not, you can easily re-activate them by flipping their switches back on in that same area of your browser’s settings.
Windows web browsing boost #2: Put those tabs to sleep (or keep them awake)
Modern web browsers — including Chrome, Edge, and Mozilla Firefox — all have features that put tabs to “sleep.” If you don’t use a tab for a while, your browser will stop it from running. It won’t be able to use resources in the background. When you click back to the tab, your browser will reactivate it.
This saves memory, and it also stops pages in background tabs from using CPU resources. Overall, it will boost your browsing speed.
However, in some cases, it could slow things down. Perhaps you often find that you switch back to a tab and your web browser quickly reloads it. If that’s a problem, you’ll want to make your browser stop putting tabs to sleep — especially if you have a powerful computer with a lot of RAM and a fast CPU. (You can also tell your browser to stop putting specific websites to sleep if it causes a problem with a website.)
To control tab suspending:
In Google Chrome, click menu > Settings and select “Performance” in the left pane. Look under “Memory Saver” and choose an option: Moderate, Balanced, or Maximum. You can also disable Memory Saver entirely — or add websites you never want Chrome to suspend to the “Always keep these sites active” list there.
In Microsoft Edge, click menu > Settings and select “System and performance” in the left pane. Use the “Save resources with sleeping tabs,” “Put inactive tabs to sleep after the specified amount of time,” and “Never put these sites to sleep” options to control this behavior.
In Mozilla Firefox, this feature is always activated — unless you dig deep into Firefox’s settings to turn off tab unloading.
Modern browsers can suspend, sleep, or unload tabs to save system resources. These are all names for the same trick.
Chris Hoffman, IDG
Windows web browsing boost #3: Preload more pages
Your web browser of choice can “preload” some pages. In other words, it might load them in the background if it thinks you’ll visit them. If you do, the page loads very quickly — because by the time you’re looking at it, it’s already loaded in the background and ready to go!
Most browsers offer different preloading options, some of which are more aggressive than others. And preloading has some potential privacy implications, as your browser might load links you wouldn’t have clicked. But, for maximum speed, you’ll probably want the most aggressive preloading options available.
To control preloading:
In Google Chrome, click menu > Settings and select “Performance” in the left pane. Scroll down to the “Preload pages” option. For maximum speed, ensure “Preload pages” is active and that it’s set to “Extended preloading.”
In Microsoft Edge, click menu > Settings and select “Cookies and site permissions” in the left pane. Click “Manage and delete cookies and site data,” and ensure “Preload pages for faster browsing and searching” is activated.
Windows web browsing boost #4: Check your browser’s task manager
Want to see what’s actually using CPU and memory? Modern Chromium-based web browsers — including Chrome, Edge, Brave, Arc Browser, and more — have task managers that will show you. (Firefox has something similar, too.)
In a Chromium-based browser, just right-click an empty spot on the tab bar and select “Task Manager” or press Shift+Esc to open it.
You will see a list of processes — including open web pages, browser extensions, and browser components — along with how much CPU and memory they’re using. If your web browser is mysteriously slow, this is a good place to check: You might spot an open web page that’s dragging everything down, and you can close it from here. You can also click the “CPU” heading to sort processes by CPU and see the most CPU-hungry items at the top of the list.
In Firefox, you can access something similar by plugging about:processes into Firefox’s address bar and pressing Enter. (The Shift+Esc shortcut will work, too!)
Your browser’s task manager will show you if a web page or browser extension is hogging system resources.
Chris Hoffman, IDG
Windows web browsing boost #5: Clear your browser cache (or stop clearing it)
Ah, the browser cache. As you browse, your web browser remembers the pages you visit and the things you type in a history, it stores images and other bits of downloaded pages in a cache, and it keeps cookies with information from websites — like your sign-in status.
Many people frequently clear this browser cache. If your browser is slow, you can try clearing browsing data. In fact, Microsoft’s official Edge browser documentation says “Clearing your browser data on a regular basis will improve the performance of your browser” — and who am I to argue with Microsoft? Surely, it understands how its own browser works.
Clearing that data is worth a shot. But, conversely, if you’re clearing your browsing data too regularly, you might want to stop doing that. The browser cache is there to speed things up: Your browser can pick images and other bits of web pages out of its cache rather than redownloading them, which improves load times when you revisit a page.
You’ll find options for clearing browsing data in your browser’s menu, but you can also just press Ctrl+Shift+Delete to quickly open the browser-history-clearing tool.
Clearing your browser data can speed things up — but clearing your browser cache too aggressively can also slow down page-load times.
Chris Hoffman, IDG
Windows web browsing boost #6: Scan for malware
We have to talk about malware for a minute. Whenever a PC is running mysteriously slow, malware is always one of the first things you should check for.
Windows web browsing boost #7: Switch up your ad-blocker
When it comes to ad-blockers, one thing people don’t often talk about is the fact that such systems can both speed up and slow down your browsing. The speed-up part is obvious: By refusing to load advertising resources on web pages, ad-blocking plugins reduce download size and produce a lighter page that opens more quickly.
But there’s also a slow-down factor: Ad-blockers might also run extra code on the pages you visit, increasing memory use and making them take longer to load.
Different ad-blockers will have different effects on performance. There’s been a lot of controversy about Google Chrome’s switch to Manifest V3 and how it stops the popular “classic” uBlock Origin ad-blocker from functioning. But here’s the thing: While the new way Chrome blocks ads with Manifest V3 is less powerful, it’s also faster. So if you happen to be using uBlock Origin and install the new uBlock Origin Lite, you could see improved page load speeds.
That’s because those new Manifest V3-compatible ad-blocker extensions work by providing a list of resources they want to block. The Chrome browser engine then blocks those resources. That means the ad-blocking browser extension itself doesn’t have to get involved and run a bunch of code on the pages you access.
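To make that concrete, here is a rough sketch of the kind of static rule list a Manifest V3 extension declares through Chrome’s declarativeNetRequest API; the browser engine, not the extension, evaluates these rules on each request. This is not any particular extension’s code — the domain and file names are placeholders, and the Python wrapper is just a convenient way to generate the JSON such an extension would ship.

```python
# Illustrative only: generate the kind of static blocking rule list a Manifest V3
# ad-blocker ships as JSON. The browser engine applies these rules itself, so the
# extension doesn't have to run code on every page. Domains are placeholders.
import json

rules = [
    {
        "id": 1,
        "priority": 1,
        "action": {"type": "block"},
        "condition": {
            # "||" anchors the pattern to a domain; "^" matches a separator,
            # per declarativeNetRequest's urlFilter syntax.
            "urlFilter": "||ads.example.com^",
            "resourceTypes": ["script", "image", "xmlhttprequest"],
        },
    }
]

# An MV3 extension would reference this file from its manifest under
# "declarative_net_request" -> "rule_resources".
with open("rules.json", "w") as f:
    json.dump(rules, f, indent=2)
```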
If you want a speed boost, it’s something worth chewing over. If you’re not yet using any ad-blocker, consider installing one. If you are using an ad-blocker, consider switching — for example, to something like uBlock Origin Lite.
Just bear in mind that any ad-blocker might occasionally break a page; you might need to turn it off for that site if you run into issues.
Windows web browsing boost #8: Try a fresh browser profile
To be clear: I’m not recommending you run out and factory-reset your PC! But popular browsers have built-in “fresh start” tools that will clean up your browser profile and its settings, wiping away any configuration changes, disabling extensions, and erasing cached files to give you a like-new browser. It’s worth a shot.
Here’s how to do it:
In Google Chrome, click menu > Settings and select “Reset Settings” in the left pane. Use the “Restore settings to their original defaults” option.
In Microsoft Edge, click menu > Settings and select “Reset Settings” in the left pane. Click the “Restore settings to their default values” option.
In Mozilla Firefox, click menu > Help > Troubleshoot Mode. You can then click “Refresh Firefox” in the dialog box that opens.
It’s a good way to start over. And hey — if you’re experiencing any kind of PC performance issue, browser-related or otherwise, the old standby advice is always good: Try turning it off and on again.
There’s more where this came from! My free Windows Intelligence newsletter delivers all the best Windows tips straight to your inbox. Plus, you’ll get free copies of Paul Thurrott’s Windows 11 and Windows 10 Field Guides (a $10 value) just for subscribing.
Zooming out, will Donald Trump’s victory in the US presidential race give Apple more bargaining power when negotiating with European regulators — and to what extent will the ongoing US anti-trust investigation of the company (shaky as it is) gain presidential support?
Those could be the kind of questions Apple CEO Tim Cook is asking himself this morning as the former President inches toward a new administration in 2025. We can surmise this based on what Trump said during the campaign, when he explained how Cook rang him up to complain about the fines levied against the company by Europe.
What Trump told Cook
Speaking on a podcast, the incoming President alleged that he told Cook he would not let the EU “take advantage of our companies.” If he keeps that promise, this suggests we may have a new entrant in the Europe versus Apple (and hence, Big Tech) ring. With more regulatory investigation — including a potential first-ever fine under the Digital Markets Act (DMA) that could reach $38 billion — headed Apple’s way in Europe, could the former and future president somehow intervene?
It’s hard to tell; after all, most people have become cynical about politicians and the promises they make (and later break) on the campaign trail. We don’t know yet whether the next Trump administration will keep promises made on the way to the White House, or just cherry pick those it wants to keep and ignore the rest.
If the new government does choose to support American tech businesses against what Republicans might see as overreach by the EU “deep state,” then the next time Europe decides to take a few billion from Cupertino, things might not go quite so easily.
When the gloves are off, what happens?
While it is understandable that Europe desperately wants to blunt foreign behemoths in the tech sector in a strategic attempt to support the growth of its own players in that space, it is possible that plan may fail. After all, as events in Valencia, Spain, suggest, Europe has other problems.
The thing is, given the inherent nativism of so much of Project Trump, can European regulators afford to play hardball here? Future history will tell. But there is no doubt the answer to these questions does matter to many in the US tech sector, and also, inevitably, to supply chain partners elsewhere.
One thing that does seem likely is that Apple’s investment in manufacturing in India will continue to accelerate, as the new administration seems set to continue the policies toward China it maintained last time it held power. Cook’s strategic vision to set up shop in India seems likely to pay long-term dividends, as does the considerable work the company has already done and continues to do to repatriate jobs to the US — an ongoing effort on which it has spent hundreds of billions of dollars so far.
Cook, meanwhile, will continue to follow his own approach toward engaging with others who hold opinions he perhaps does not share. “Personally, I’ve never found being on the sideline a successful place to be,” he told employees in 2019.
That approach led him to become one of Trump’s top tech advisors during the first administration.
Cook’s way of doing things also seems to have won some support from Trump, who recently said he thought that if Cook didn’t run Apple it wouldn’t be nearly as successful as it is now. “I think Tim Cook’s done an amazing job,” he said. “And I’m not knocking Steve Jobs.”
Trump also seemed impressed at the eye-watering size of Europe’s fines levied against Apple, which he characterized as “a lot.” With all of this in mind, it is perhaps important to note that Trump in 2019 said Cook has a direct line to the (now) newly-re-elected President.
The art of the deal
Might this contribute to the art of some kind of new EU deal? We don’t know that, either, but as America — and the world — digests the election results, it might yet prove an important moment for Apple’s business, too. European regulators need to think about it.