Thursday, May 7, 2026

 

QUALCOMM


New Snapdragon processors: a giant leap for affordable cell phones

When we think about major technological innovations in cell phones, our minds almost always and immediately go to those top-of-the-line models that cost over a thousand euros. But the stark truth is that the vast majority of us are simply looking for a device that does its job flawlessly, without emptying our bank account. Qualcomm, the famous brand that builds the "brains" of the overwhelming majority of Android smartphones in circulation, is perfectly aware of this reality and has just revealed a genuine revolution for the mid-range and entry-level segments. Get ready to meet the brand-new Snapdragon 6 Gen 5 and Snapdragon 4 Gen 5 processors.

Qualcomm's main focus for this new generation of processors is very simple and direct: to improve the functionalities you actually use in your day-to-day life. Instead of focusing solely on achieving stratospheric synthetic benchmark numbers that mean absolutely nothing to the average user, the company has focused on creating what it calls "next-generation capabilities designed for real-world experiences."

One of the biggest and most visible new features shared by both processors is the introduction of the new Snapdragon Smooth Motion UI interface. What does this mean for you in practice? It means that annoying stuttering when opening a heavier app or quickly navigating through your phone's menus will definitely become a thing of the past. Navigation becomes incredibly smoother, ensuring your smartphone responds to your fingers immediately and without frustrating hesitations.

Snapdragon 6 Gen 5: Cutting-edge photos and flawless gaming...If you're a user who demands a little more from your machine but categorically refuses to pay luxury prices, the Snapdragon 6 Gen 5 was designed with you in mind. This is the more robust and powerful of the two new chips and brings updates that will transform the way you capture the world around you. In the demanding photographic department, Qualcomm has integrated direct support for AI-based "Night Vision." This will allow you to take stunning night photos on mid-range phones, something that was previously reserved for the elite. Furthermore, the processor supports fast 32MP capture without any lag and even an unbelievable 100x digital zoom.

But the fun doesn't stop at photography. For avid video game lovers, the brand has introduced the latest version of its Adaptive Performance Engine 4.0. This intelligent system was meticulously designed to offer you long and uninterrupted gaming sessions, surgically managing power delivery so that the phone doesn't overheat in your hands. To help with this visual feast, the performance of the graphics unit (the Adreno GPU) has taken a tremendous 21% leap. This not only improves your games but also gives a strong visual "boost" to all the applications you use daily.

Snapdragon 4 Gen 5: 90fps arrives on the cheapest phones...Don't think, however, that Qualcomm has forgotten the more economical phones on the market. The new Snapdragon 4 Gen 5 brings with it a truly historic milestone for this family of basic processors. For the very first time in the history of the 4 series, you'll be able to enjoy video games at 90 frames per second (fps).

If you've ever tried playing a fast-paced action title on a cheaper phone and felt the image painfully dragged, this technological innovation will change everything. Playing at 90fps offers a brutal competitive advantage and visual fluidity that, until very recently, was strictly and exclusively reserved for high-end models. This proves that the technical barrier between expensive and more affordable phones is becoming increasingly tenuous and difficult to justify.

These fantastic innovations clearly show that the industry is finally realizing that true innovation doesn't have to be an exclusive benefit for those with deeper pockets. With the arrival of these two new processors on the assembly lines of major mobile phone manufacturers, the mid-range and entry-level devices that will land on store shelves in the coming months promise to deliver a super rich and fluid user experience. If you're considering upgrading your device soon and are on a tighter budget, just wait a little longer, because your next affordable smartphone will have a much brighter "brain."

Qualcomm just widened its mobile roadmap with two new chips that aim to make cheaper phones feel less cheap. The Snapdragon 6 Gen 5 and Snapdragon 4 Gen 5 promise faster app launches, better gaming, and longer battery life, while pushing Wi‑Fi 7 and newer AI features deeper into the midrange and entry tiers.

According to Qualcomm, the Snapdragon 6 Gen 5 brings "advanced capabilities to more devices” such as AI-powered camera features, immersive gaming, and more efficient performance, whereas the Snapdragon 4 Gen 5 is marketed to make essential connectivity and gaming more accessible without sacrificing battery life.

Between the two, the 6 Gen 5 actually posts the more modest generational gains. The spec sheet says it delivers up to 20% faster app launches, 18% less screen stutter, and as much as 21% better GPU performance over the previous 6 Gen 4. These meaningful upgrades come balanced against some odd downgrades: mmWave 5G and L2 GPS band support are gone, and USB 3.2 has been swapped for USB 2.0.

As for the cheaper stablemate, Qualcomm says the 4 Gen 5 brings 43% faster app launches, 25% less screen stutter, and a whopping 77% jump in GPU performance over the 4 Gen 4, along with the 4 series' first support for 90 fps gaming, not to mention Dual SIM Dual Active 5G connections. And yes, for all the positives, there's a negative here as well: RAM support has dropped from LPDDR5 to LPDDR4X.

With the new Snapdragon 4 and 6, it's obvious where Qualcomm is choosing to spend its silicon budget. The company is leaning into “Smooth Motion UI,” stronger battery efficiency, and connectivity upgrades, betting that users are more likely to appreciate fewer stalls, faster loading, and steadier wireless links than niche specs they’ll never touch.

Qualcomm states that both chipsets will show up in commercial devices in the second half of 2026, with Honor, OPPO, realme, REDMI, and Xiaomi among the initial launch partners. Yan Chen Wei, Senior Vice President at Qualcomm, said, “This launch underscores our focus on delivering impactful solutions, with each platform intentionally designed to strike the right balance of performance, power efficiency and connectivity—helping our partners deliver next-generation smartphone experiences to more users globally.”

by mundophone


TECH


How will Samsung manage to lower the price of its next foldable phones?

Samsung seems poised to break one of the most deeply rooted traditions in its premium device line. If you follow the market, you know that for years, the South Korean giant's foldable phones were the last bastion where Qualcomm reigned unopposed. However, a radical change in processor strategy is looming on the horizon, and the upcoming Galaxy Z Flip 8 may be the protagonist of a small disruption that promises to generate buzz among tech enthusiasts.

According to recent leaks, detected in Samsung's own source code by informant Erencan Yılmaz, the Galaxy Z Flip 8 may not arrive on the market with a single processor globally. The brand is reportedly considering — or at least testing — a split strategy: one version equipped with Qualcomm's future Snapdragon and another powered by Samsung's own Exynos processor.

This isn't a completely unprecedented maneuver, as we've already seen signs of this transition in the Galaxy Z Flip 7, but the scale this time seems to be different. The big question is whether this division will be geographical, as was the case in the past with the Galaxy S line, or whether Samsung is simply keeping its options open before pressing the mass production button. For you, as a user, this means that the choice of model may depend much more on where you buy or how much you are willing to pay.

While the waters seem turbulent around the "Flip," when it comes to the book-style design Samsung prefers not to break up a winning team. Everything indicates that both the Galaxy Z Fold 8 and the new Galaxy Z Fold 8 Wide will remain exclusive to Qualcomm, using the Snapdragon 8 Elite Gen 5 "for Galaxy".

This decision makes perfect sense from a market positioning standpoint. The Fold is the quintessential productivity device, the most expensive in the catalog, and Samsung knows that its target audience does not forgive any hesitation in performance. By keeping Qualcomm's best processor in these models, the brand avoids the growing pains of new architectures and ensures that its flagship continues to be seen as an infallible raw performance machine. 

Between cost-cutting and renewed confidence...Why would Samsung decide to swap a widely praised Snapdragon processor for an Exynos in one of its most popular phones? The answer lies in two fundamental pillars: economy and technical maturity.

Price control: With component costs rising year after year, using its own processor is the most effective way for Samsung to lower the final price of the device or, at least, maintain profit margins without inflating what you pay in the store.

-Real-world performance: Tests of the Exynos 2500 showed that, although benchmark numbers may lag slightly behind the competition, the day-to-day user experience is extremely fluid.

-Battery life: Vertical integration between hardware and software allows for power optimization that sometimes compensates for a lack of raw power.

-Independence: Reducing dependence on Qualcomm gives Samsung greater bargaining power and control over its release schedule.

What does this change mean for your next purchase...Ultimately, the big question is whether you'll notice a difference if you have a Galaxy Z Flip 8 with Exynos instead of Snapdragon. The truth is that Samsung has shown increasing confidence in its own silicon. If this change helps put a foldable phone in your pocket at a more affordable price, most users will hardly complain about a slightly lower score on a synthetic test they'll never even run.

The foldable landscape is changing, and Samsung has realized that to mass-market these devices, it can't be held hostage by prices imposed by third parties. Now we're waiting to see if this mixed strategy is confirmed at the official launch or if the brand decides to make the full leap to Exynos in the Flip line, leaving Snapdragon only for those who demand (and pay for) the luxury of the Fold.

As of mid-2026, Samsung is aiming to lower the price of its foldable phones—specifically the Galaxy Z Flip 8 and upcoming "Wide Fold" models—by adopting a strategy of component cost-cutting, diversified supply chains, and, in some cases, utilizing older, proven technology. Despite facing intense pressure from rising memory costs, Samsung is pushing to make foldables more accessible to maintain market share against competitors and Apple.

Strategies for lowering costs:

-Introducing "Fan Edition" (FE) foldables: Samsung is expanding its portfolio with lower-cost options, such as the rumored Galaxy Z Flip 8 FE, designed to offer the core folding experience at a reduced price point.

-Utilizing older display technology: To control costs, Samsung is reportedly opting to use older "M13" organic material panels for the Z Fold 8 and Z Flip 8, rather than the more expensive, newer "M14" materials found in the Galaxy S26 Ultra. This strategy helps avoid steep price increases caused by the high costs of the newest folding displays.

-Supply chain diversification & third-party suppliers: In a major shift, Samsung is evaluating third-party display suppliers to break away from exclusive, higher-cost in-house components from Samsung Display. For example, replacing Samsung SDI batteries with alternatives from companies like Amperex Technology Limited (ATL) is considered a key step in reducing costs.

-Mixed chipset strategy: While the premium Galaxy Z Fold 8 is expected to use high-end Qualcomm Snapdragon processors, Samsung is planning to use its own Exynos 2600 chips for the Galaxy Z Flip 8 in many regions to mitigate high manufacturing costs.

-Increased production efficiency: As the technology matures, Samsung is improving yield rates on its manufacturing lines, reducing the expense per unit.

Enhanced trade-in & promotional deals: Samsung heavily utilizes aggressive trade-in incentives, such as the $1,000 discounts seen with the Z Fold 7, to make the effective price comparable to traditional flagship phones.

Challenges affecting costs in 2026:

-Rising memory prices: Surging AI-related demand has pushed RAM to account for as much as 30–40% of a smartphone's cost, putting massive pressure on Samsung to keep prices stable.

-Competitor pressure: Chinese manufacturers offering cheaper foldable alternatives have forced Samsung to innovate on design—making them thinner and lighter—while simultaneously trying to keep costs down.

mundophone

Wednesday, May 6, 2026


DIGITAL LIFE



Google Chrome silently installs a 4 GB AI model on your device (video)

Google is the latest company to generate negative headlines for its AI integration, this time over the Gemini integration in Chrome announced back in September of last year. In the months since, the feature has rolled out to users' PCs, and because it is opt-out, most people are now running Chrome with AI features enabled by default. Privacy advocates and AI critics have taken note and are pointing fingers at a 4 GB "weights.bin" file used by Google's local AI, Gemini Nano. Critics have also noted that the weights file restores itself upon deletion and, even on a brand-new Chrome install, is automatically downloaded to the user's device.

That said, it's surprisingly easy to disable AI features in Chrome, including permanently deleting the weights.bin file. In fact, I did so as soon as I noticed the Gemini button appear in my Chrome window, right-clicking it to disable it. After that, open "chrome://flags" in your address bar and find the "Enables optimization guide on device" flag. Disable it, and the folder containing the weights.bin file will be deleted automatically and won't come back. The file's persistence when it is removed manually is most likely down to this conflicting browser flag and Chrome repairing itself, not malicious intent.
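For readers who want to check their own machines before touching any flags, the model directory can be located and measured directly. The sketch below is a minimal probe; the profile paths are assumptions based on Chrome's default install locations and may differ by version, channel, or a custom user-data directory:

```python
import os
from pathlib import Path

# Likely locations of Chrome's on-device model directory.
# NOTE: these paths are assumptions based on default profile locations;
# Chrome may store OptGuideOnDeviceModel elsewhere on your setup.
CANDIDATES = [
    Path.home() / "Library/Application Support/Google/Chrome/OptGuideOnDeviceModel",             # macOS
    Path(os.environ.get("LOCALAPPDATA", "")) / "Google/Chrome/User Data/OptGuideOnDeviceModel",  # Windows
    Path.home() / ".config/google-chrome/OptGuideOnDeviceModel",                                 # Linux
]

def dir_size_bytes(root: Path) -> int:
    """Recursively sum file sizes under root (roughly what `du -sb` reports)."""
    return sum(p.stat().st_size for p in root.rglob("*") if p.is_file())

for path in CANDIDATES:
    if path.is_dir():
        print(f"{path}: {dir_size_bytes(path) / 1e9:.2f} GB on disk")
```

If one of these directories shows up at roughly 4 GB, that is the weights file; disabling the flag described above is what makes its deletion stick.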

Two weeks ago I wrote about Anthropic silently registering a Native Messaging bridge in seven Chromium-based browsers on every machine where Claude Desktop was installed [1]. The pattern was: install on user launch of product A, write configuration into the user's installs of products B, C, D, E, F, G, H without asking. Reach across vendor trust boundaries. No consent dialog. No opt-out UI. Re-installs itself if the user removes it manually, every time Claude Desktop is launched.

This week I discovered the same pattern, executed by Google. Google Chrome is reaching into users' machines and writing a 4 GB on-device AI model file to disk without asking. The file is named weights.bin. It lives in OptGuideOnDeviceModel. It is the weights for Gemini Nano, Google's on-device LLM. Chrome did not ask. Chrome does not surface it. If the user deletes it, Chrome re-downloads it.

The legal analysis is the same one I gave for the Anthropic case. The environmental analysis is new. At Chrome's scale, the climate bill for one model push, paid in atmospheric CO2 by the entire planet, is between six thousand and sixty thousand tonnes of CO2-equivalent emissions, depending on how many devices receive the push. That is the environmental cost of one company unilaterally deciding that two billion people's default browser will mass-distribute a 4 GB binary they did not request.
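The six-to-sixty-thousand-tonne range is back-of-envelope arithmetic that readers can reproduce. The constants below are illustrative assumptions (0.03 kWh of network transfer energy per GB and 0.5 kg CO2e per kWh of grid electricity), not measured values:

```python
# Back-of-envelope estimate of CO2e for one mass model push.
GB_PER_DEVICE  = 4.0    # size of weights.bin
KWH_PER_GB     = 0.03   # ASSUMED network transfer energy intensity
KG_CO2_PER_KWH = 0.5    # ASSUMED average grid carbon intensity

def push_emissions_tonnes(devices: int) -> float:
    """Tonnes of CO2e for pushing the model to `devices` machines."""
    kwh = devices * GB_PER_DEVICE * KWH_PER_GB
    return kwh * KG_CO2_PER_KWH / 1000.0  # kg -> tonnes

print(push_emissions_tonnes(100_000_000))    # 100M devices -> ~6,000 t
print(push_emissions_tonnes(1_000_000_000))  # 1B devices  -> ~60,000 t
```

The spread in the quoted range comes almost entirely from how many of Chrome's installed base actually receive the download.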

This is, in my professional opinion, a direct breach of Article 5(3) of Directive 2002/58/EC (the ePrivacy Directive) [2], a breach of the Article 5(1) GDPR principles of lawfulness, fairness, and transparency [3], a breach of Article 25 GDPR's data-protection-by-design obligation [3], and an environmental harm of a magnitude that would be a notifiable event under the Corporate Sustainability Reporting Directive (CSRD) for any in-scope undertaking [4].

What is on the disk and how it got there...On any machine that has Chrome installed, in the user profile, sits a directory whose name is OptGuideOnDeviceModel. Inside it is a file called weights.bin. The file is approximately 4 GB. It is the weights file for Gemini Nano. Chrome uses it to power features Google has marketed under names like "Help me write", on-device scam detection, and other AI-assisted browser functions.

The file appeared with no consent prompt. There is no checkbox in Chrome Settings labelled "download a 4 GB AI model". The download triggers when Chrome's AI features are active, and those features are active by default in recent Chrome versions. On any machine that meets the hardware requirements, Chrome treats the user's hardware as a delivery target and writes the model.

The cycle of deletion and re-download has been documented across multiple independent reports on Windows installations [5][6][7][8] - the user deletes, Chrome re-downloads, the user deletes again, Chrome re-downloads again. The only ways to make the deletion stick are to disable Chrome's AI features through chrome://flags or enterprise policy tooling that home users do not generally have, or to uninstall Chrome entirely [5]. On macOS the file lands as mode 600 owned by the user (so it is deletable in principle) but Chrome holds the install state in Local State after the bytes are written, and as soon as the variations server next tells Chrome the profile is eligible, the download fires again - the architecture is the same, only the file permissions differ.

How I verified this on a freshly created Apple Silicon profile...Most of the existing reporting on this behaviour is from Windows users who noticed their disk filling up - useful, but Google could (and probably will) try to characterise those reports as anecdotes from non-representative configurations. So I went looking for a clean witness on a different platform.

The witness I found is macOS itself. The kernel keeps a filesystem event log called .fseventsd - it records every file create, modify and delete at the OS level, independent of any application logging. Chrome cannot edit it, Google cannot remotely reach it, and the page files that record the events survive the deletion of the files they reference.

I created a Chrome user-data directory on 23 April 2026 to run an automated audit (one of the WebSentinel 100-site privacy sweeps). The audit driver works entirely through the Chrome DevTools Protocol (CDP) - it loads a page, dwells for five minutes with no input, captures events, and closes Chrome between sites - and the profile had received zero keyboard or mouse input from a human at any point in its existence. Every "AI mode" surface in Chrome was untouched - in fact, every UI surface in Chrome was untouched: the audit driver only interacts with the document via CDP, and the omnibox is never reached. By 29 April the profile contained 4 GB of OptGuideOnDeviceModel weights - and I knew it because a routine du -sh of the audit-profile directory caught it during a cleanup pass.

mundophone


TECH


Researcher explores the hidden science of pipe failure

How do aging cast iron pipes actually start leaking? The School of Mechanical, Aerospace and Civil Engineering's Edward John is the focus of a new UK Water Industry Research (UKWIR) video about his Ph.D. research, which uncovers how pressure changes in old cast iron pipes cause cracks and leaks over time, with findings that enable water companies to fix pipes before they burst completely.

Cast iron was the gold standard for plumbing for decades because of its balance of strength and manufacturability. However, it isn't invincible. With approximately 40% of the UK's water supply network composed of cast iron pipes installed during or before the 1960s, large parts of the existing network are reaching the end of their natural lifespan.

This aging infrastructure has led to a high failure rate, manifesting in frequent leaks and bursts, and is a major barrier for UK water companies who want to achieve their goal of halving water leakage by 2050.

The transition from a solid pipe to a leaking one is usually a slow, chemical corrosion process that happens from the outside in, followed by a terminal phase of crack initiation and growth over a few years.

Edward's Ph.D. research, featured in the UKWIR report, focuses on understanding the cracking part of the process, specifically how pressure changes, such as at night when water usage is low and pressure is often higher, force small cracks to open up, leading to the leaks we see today.

A University of Sheffield researcher working in the laboratory. Credit: University of Sheffield

Using controlled lab experiments, Edward studied the mechanical fatigue behind these leaks and discovered that reducing pressure temporarily closes micro leaks by easing pipe stress, but the cyclic pressure (the constant rising and falling of water pressure) causes the cracks to keep growing and the leak eventually becomes permanent.
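The mechanism Edward describes, slow advance of a crack under repeated pressure cycles, is classically modelled with the Paris law, da/dN = C(ΔK)^m. The sketch below is a generic textbook illustration of that law, not the model from this Ph.D. work, and every constant in it is a placeholder assumption:

```python
import math

# Paris-law fatigue sketch: how many pressure cycles grow a crack
# from length a0 to af. All constants are ILLUSTRATIVE placeholders,
# not calibrated values for cast iron water mains.
C, m        = 1e-11, 3.0  # Paris-law coefficients (assumed)
Y           = 1.12        # crack geometry factor (assumed)
delta_sigma = 40e6        # stress range per pressure cycle, Pa (assumed)

def cycles_to_grow(a0: float, af: float, steps: int = 10_000) -> float:
    """Numerically integrate dN = da / (C * dK^m) over crack length."""
    da, n, a = (af - a0) / steps, 0.0, a0
    for _ in range(steps):
        dK = Y * delta_sigma * math.sqrt(math.pi * a) / 1e6  # MPa*sqrt(m)
        n += da / (C * dK ** m)
        a += da
    return n

print(f"{cycles_to_grow(0.001, 0.010):,.0f} cycles from 1 mm to 10 mm")
```

The qualitative behaviour matches the finding: each cycle extends the crack a little, so reducing the pressure range slows the growth, but as long as cycling continues the crack keeps advancing and the leak eventually becomes permanent.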

By understanding these failure mechanics, the work allows utility companies to move beyond guesswork and target the specific pipes most likely to burst, which reduces the occurrence of new leaks, saves millions of gallons of water, and in turn reduces water bills for consumers.

The work shows real promise to change how companies deal with underground pipes in the future.

Edward said, "We're hoping to carry out some follow-up research that will work towards having more of an implementable solution that's based on the kind of fundamental understanding from my Ph.D. The research will allow water companies to more fully understand pipe deterioration and proactively replace pipes in a targeted way.

"At the moment, I am working on two projects about pipe condition assessment—one is looking at measuring the thickness of cast iron pipes using acoustic sensor techniques to detect small leaks. The other is looking at sewer liner condition assessment, which is trying to find non-visible defects."

Provided by University of Sheffield  

Tuesday, May 5, 2026


DIGITAL LIFE


Novel approach to training AI saves energy, improves speed, and minimizes data sent across networks

In a novel attempt to improve how large language models learn and make them more capable and energy-efficient, Stevens Institute of Technology researchers have devised an algorithm that improves AI data sharing, boosts performance and reduces power consumption.

Large language models like ChatGPT are huge. Letting many parties train them together without sharing users' private data—an approach called federated learning—is slow and inefficient. To collaborate, the participants must constantly exchange their updated copies of the entire model, and that is a huge amount of information. This approach consumes a lot of network bandwidth and memory and is energy intensive. As a result, models can't be synchronized as often as necessary, leaving participants with outdated versions.

"It's too much data to share," says Stevens Ph.D. candidate Yide Ran, who was the driving force behind the effort to improve the process. "It's like sending in an entire encyclopedia when you only need to change a few entries. But you really don't need to do that."

Working with his advisors, Zhaozhuo Xu, Assistant Professor in the Department of Computer Science at the School of Engineering who studies machine learning, and Denghui Zhang, Assistant Professor of Information Systems and Analytics at the School of Business, Ran sought to improve how language models share their updates.

The team built upon the previously known concept that effective learning in large language models is often driven by a surprisingly small but well-chosen subset of parameters. The result is a more agile, faster-working model that also uses less energy. The researchers named the model MEERKAT after the animal, known for its dexterity and speed.

The team outlined their findings in a paper titled "Mitigating Non-IID Drift in Zeroth-Order Federated LLM Fine-Tuning with Transferable Sparsity," which was presented at the 2026 International Conference on Learning Representations.

Instead of sharing the entire giant AI model, MEERKAT shares updates to only 0.1% of the model, which includes the most important parameters.

"So you are no longer sending the entire encyclopedia when only a few key definitions have changed," explains Zhang. That shrinks communications by over 1,000 times. "Updates that used to be gigabytes are now just a few megabytes," Zhang says.

MEERKAT's other efficiency secret is using a different error-checking approach. Standard AI training requires an intense mathematical process called backpropagation, which stands for backward propagation of errors, in which AI performs self-checks to avoid mistakes. Although it's a core algorithm used to train neural networks by minimizing the difference between predicted and actual outputs, backpropagation consumes huge amounts of memory and energy. MEERKAT simply tweaks the model slightly and checks the results, completely bypassing backpropagation.
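The perturb-and-check alternative to backpropagation is known as zeroth-order optimization. The sketch below demonstrates the idea with an SPSA-style estimator on a toy quadratic objective; it illustrates the family of methods, not MEERKAT's exact procedure:

```python
import random

random.seed(0)

def loss(w):
    """Toy objective: squared distance to the target vector [1, 2, 3]."""
    target = [1.0, 2.0, 3.0]
    return sum((wi - ti) ** 2 for wi, ti in zip(w, target))

w = [0.0, 0.0, 0.0]
eps, lr = 1e-3, 0.05
for _ in range(2000):
    # Random +/-1 perturbation direction (SPSA-style).
    z = [random.choice((-1.0, 1.0)) for _ in w]
    # Two forward evaluations estimate a directional derivative --
    # no gradients, no backpropagation.
    g = (loss([wi + eps * zi for wi, zi in zip(w, z)])
         - loss([wi - eps * zi for wi, zi in zip(w, z)])) / (2 * eps)
    w = [wi - lr * g * zi for wi, zi in zip(w, z)]

print([round(wi, 2) for wi in w])  # converges toward [1.0, 2.0, 3.0]
```

Each step needs only two loss evaluations, which is why this family of methods avoids the memory and energy bill of storing activations for a backward pass.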

Finally, small updates allow for more frequent synchronization of data, which is another breakthrough, as it keeps models up to date.

"Because updates are so tiny, data can now be sent back and forth more often," says Xu. "The result is a much better shared model."

This new approach substantially reduces computational and communication costs, helping make advanced AI adaptation more feasible for resource-constrained institutions, researchers say. Their work will also support more equitable deployment of AI in domains such as health care, education and cross-institutional collaboration, where centralized data collection is often difficult to achieve due to privacy and other issues.

Bypassing backpropagation (MEERKAT project): an innovative approach developed by researchers at the Stevens Institute of Technology, the MEERKAT system saves energy by avoiding the mathematically intensive process of backpropagation.

Innovation: The system makes small adjustments to the model and verifies the results, bypassing the complex error checks that typically drain memory and energy.

Benefit: In addition to saving energy, it reduces the cost of data communication, facilitating training on resource-limited local devices.


by: Stevens Institute of Technology


TECH


Microsoft yields to pressure and removes controversial advice about 32 GB of RAM

If you've recently built or bought a gaming PC, you're guaranteed to encounter the age-old PC gaming dilemma: how much RAM is really needed to run modern games without stuttering and frame rate drops? The answer to this question usually varies immensely depending on who you ask, the games you prefer, and your tolerance for potential performance issues. However, Microsoft recently decided to come out and give its own official "verdict" on the matter. The only problem? The gaming community didn't find the suggestion very amusing, and the tech giant was forced to quietly back down.

In an article published (and now defunct) on its official website, the owner of Windows and Xbox decided to establish what it considered the new golden rules for the ideal memory configuration in a computer dedicated to video games.

According to the company's text, 16 GB of RAM should only be seen as the "practical starting point" or the minimum baseline. Microsoft's real recommendation, which it boldly labeled the "no worries upgrade," focused on a hefty 32 GB of memory. The official justification even had some logical basis: having 32 GB gives you a gigantic margin of maneuver if you're the type of gamer who likes to have the game running while keeping Discord open to chat with friends, has your browser (like the resource-hungry Google Chrome) full of tabs with guides and tutorials, or uses streaming tools running in the background.

The corporate theory seemed very sensible on paper, but Microsoft quickly forgot a crucial and unavoidable detail: the finite budget of the average gamer.

As soon as the specialized portal Windows Latest discovered this advice article and shared it with the rest of the world, the reaction on social media, subreddits, and forums dedicated to hardware was immediate and relentless. Gamers didn't hold back on criticism, and many resorted to sarcasm to ridicule the company's recommendation.

The main complaint is very easy to understand. In a market where the latest graphics cards and top-of-the-line processors already cost a fortune, demanding or attempting to normalize gamers spending tens or hundreds of euros (or dollars) more just to double their RAM is an attitude that sounds completely disconnected from economic reality. The general consensus of the community was clear: asking people to loosen their purse strings to reach the 32 GB mark doesn't send a good message, especially when many are still struggling to assemble a basic machine.

Microsoft's "ninja blackout"...The outcome of this story is a classic case of corporate crisis management and damage control, executed in the quietest way imaginable. Faced with the avalanche of negative comments and ridicule from the gaming community, Microsoft implicitly agreed that it had shot itself in the foot with that publication.

The new foundation for gaming...In a post on its Learning Center, Microsoft had published an article describing what constitutes "a good gaming computer," and what caught people's attention was the company's choice to set 32 GB of RAM as the new standard for gaming PCs.

In the text, the company explained that 32 GB of RAM is necessary to achieve "smooth gameplay." Since most current games consume around 16 GB of RAM while running, the company's reasoning makes sense on paper, even if the recommendation is unrealistic for many.

Microsoft comments that users often also use applications like Discord, browsers, or even streaming programs while playing, so it's necessary to have enough RAM to run everything smoothly.

The only problem is that, thanks to investments in generative artificial intelligence, the price of RAM has skyrocketed worldwide due to scarcity.

Furthermore, the company also commented that SSDs are being used more frequently, which makes sense given that the recommended specifications of most modern games now call for an SSD to run the titles.

What was the solution found? The company simply removed the article from its official website overnight, without leaving any trace, footnote, or apology. The maneuver was so quick and stealthy that not even the famous Wayback Machine (the digital archive that preserves web pages) managed to capture a copy of the original page for posterity, turning Microsoft's words into a veritable digital ghost.

The stark truth is that, regardless of this controversy, modern video games are effectively becoming increasingly demanding. Having 16 GB on your system still allows you to play the vast majority of the current catalog in a very decent way, but the inevitable transition to higher capacities is looming on the horizon. Microsoft's mistake wasn't a technical forecasting error, but rather a giant error in timing and empathy with your wallet.

mundophone

Monday, May 4, 2026


DIGITAL LIFE


Think online ads are harmless? They could be revealing your private life, say researchers

A new study has uncovered a significant and largely invisible privacy risk in the online advertising ecosystem: the ads you see may be enough to reveal sensitive personal information.

Researchers from the ARC Center of Excellence for Automated Decision-Making and Society (ARC ADM+S) at UNSW Sydney and QUT have demonstrated that artificial intelligence can assess personal attributes, including political preferences, education level, and employment status, based solely on the advertisements a person is shown online.

The study analyzed more than 435,000 Facebook ads seen by 891 Australian users, collected through the Australian Ad Observatory project—a signature project of the ARC ADM+S.

Using advanced large language models (LLMs), researchers found that:

-Personal traits could be inferred without access to browsing history or personal data

-Profiles could be built from short browsing sessions

-AI systems matched and sometimes exceeded human ability to infer personal characteristics

-The process was over 200 times cheaper and 50 times faster than human analysis.
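The core of the approach described above is feeding a user's ad stream to an off-the-shelf LLM and asking it to infer an attribute. A minimal sketch of what such a prompt-based pipeline might look like follows; `call_llm` is a hypothetical stand-in, and the ad texts are invented examples, since the paper's actual prompts and models are not reproduced here:

```python
# Hypothetical sketch of prompt-based attribute inference from an
# ad stream. `call_llm` is a stand-in, not a real API.

def build_profile_prompt(ad_texts, attribute):
    """Assemble a zero-shot prompt asking an LLM to infer one user
    attribute from the ads that user was shown."""
    ads = "\n".join(f"- {t}" for t in ad_texts)
    return (
        f"The following ads were shown to one user:\n{ads}\n"
        f"Based only on these ads, estimate the user's {attribute}."
    )

def call_llm(prompt):
    # Stand-in for an off-the-shelf LLM call; returns a dummy answer.
    return "unknown"

session = [
    "50% off graduate business degrees",
    "Premium office chairs for professionals",
    "Investment property webinar this weekend",
]
prompt = build_profile_prompt(session, "education level")
answer = call_llm(prompt)
```

Even this toy version shows why the attack is cheap: the attacker only needs the ad texts and one API call per attribute, with no access to the user's browsing history or account data.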

The research shows LLMs can very quickly and cheaply assess online adverts being fed to individuals to predict a wide range of detailed personal information.

In a paper presented at the ACM Web Conference 2026, the researchers say, "Our results demonstrate that off-the-shelf LLMs can accurately reconstruct complex user private attributes.

"Critically, actionable profiling is feasible even within short observation windows, indicating that prolonged tracking is not a prerequisite for a successful attack."

Lead author Baiyu Chen, from UNSW, said the findings challenge common assumptions about online privacy.

"The key point is that the ads a person sees are not random. Advertising systems optimize delivery based on inferred profiles and behaviors, so the overall pattern of ads shown to a user can carry signals about traits such as gender, age, education, employment status, political preference, and broader socioeconomic position.

"Our study shows that LLMs can analyze those patterns and infer private attributes from ad exposure alone.

"These findings provide the first empirical evidence that ad streams serve as a high-fidelity digital footprint, enabling off-platform profiling that inherently bypasses current platform safeguards, highlighting a systemic vulnerability in the ad ecosystem and the urgent need for responsible web AI governance in the generative AI era.

"This work reveals a critical blind spot in web privacy: the latent leakage of user private attributes through passive exposure to algorithmic advertising."

A critical blind spot in privacy...By using AI to analyze ad content, the researchers—including Professor Flora Salim, Professor Daniel Angus, Dr. Benjamin Tag and Dr. Hao Xue—show that streams of ads act like highly detailed digital fingerprints, allowing private attributes to be reconstructed with surprising accuracy, often matching or even exceeding human judgment.

Crucially, the research shows this is not a theoretical risk. Profiles can be built quickly and at scale, even from short browsing sessions, and without long-term tracking. Even when predictions are not exact, they are often close enough to reveal meaningful insights about a person's life stage or financial situation.

How it could be exploited...While major platforms have restricted advertisers from targeting sensitive categories, the study shows that algorithmic ad delivery still encodes these traits indirectly and that this information can now be extracted using widely available AI tools.

This creates a new form of privacy risk where:

-Users do not actively share information.

-No hacking or platform-side access is required.

-Profiling can happen outside platform oversight.

The researchers warn that everyday tools such as browser extensions could be repurposed to quietly collect ads and build detailed user profiles—bypassing platform safeguards and leaving little trace.

In the paper, they say, "We identify browser extensions that abuse legitimate privileges as the potential primary vector for this attack. This scenario is severe due to its inherent stealth and scalability.

"Rather than distributing specialized malware, an adversary can opportunistically deploy this attack within the existing ecosystem of widely installed, benign functioning extensions, such as ad blockers, coupon finders, or page translators.

"These extensions legitimately require permissions to read web page content to function, providing a perfect cover for data harvesting."

Implications for policy and regulation...The findings suggest current privacy protections may not go far enough.

As AI tools make this kind of analysis easier and more accessible, the researchers argue that regulation must evolve to address not just data collection, but what can be inferred from the content people are exposed to.

Addressing this risk will require rethinking privacy frameworks to account for the hidden signals embedded in everyday online experiences—including the ads users passively consume.

"In terms of protection, users can reduce the risk by being cautious with browser extensions, limiting unnecessary permissions, and using available privacy and ad-personalization settings," said Chen.

"However, this is not something users can fully solve on their own, because the broader issue is systemic: people cannot easily opt out of the ad ecosystem altogether, so stronger platform safeguards are also needed."

The study draws on data from the Australian Ad Observatory, a citizen science initiative that collects ads seen by everyday users. It represents one of the largest real-world investigations into how AI can infer personal information from online advertising.

Online ads have gone from being random pitches to becoming a detailed mirror of your private life. Recent research reveals that the mere stream of ads you receive, even without clicking on anything, can be used by artificial intelligence to reconstruct your personal profile with alarming accuracy.

How ads "snitch" on you... Unlike what many think, ads don't appear by chance. They are the end result of a complex system called Real-Time Bidding (RTB), an instant auction that takes place in the milliseconds it takes for a page to load.
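At the heart of RTB is typically a second-price auction: while the page loads, bidders compete for the impression, the highest bid wins, and the winner pays the second-highest price. A minimal sketch of that mechanism, with invented bidder names and values:

```python
# Minimal sketch of an RTB-style second-price auction, the mechanism
# commonly used to place an ad in the milliseconds a page takes to load.

def run_auction(bids):
    """Return (winner, price): the highest bidder wins the impression
    but pays the second-highest bid."""
    if len(bids) < 2:
        raise ValueError("need at least two bids")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]     # second-highest bid sets the price
    return winner, price

# Illustrative bids (in currency units per impression).
bids = {"travel_dsp": 2.40, "retail_dsp": 1.95, "finance_dsp": 1.10}
winner, price = run_auction(bids)
print(winner, price)
```

The privacy angle is that each bid request carries a profile of the user, so every participant in the auction, not just the winner, can observe which kinds of advertisers are willing to pay for that person's attention.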

Digital ad impressions: Researchers at UNSW Sydney have demonstrated that AI models (LLMs) can infer traits such as political preference, education level, employment status, and financial situation simply by analyzing the pattern of ads displayed to a user.

Surveillance without clicks: You don't need to interact with the ad to be exposed. "Passive exposure" to algorithmic content serves as a high-fidelity digital footprint that bypasses current platform protections.

Short sessions are enough: Months of monitoring are not necessary. Actionable profiles can be created from short browsing sessions, making the attack quick and cheap for malicious actors.

The role of "data brokers": Advertising systems fuel a billion-dollar surveillance industry:

Location sales: Precise location data is harvested through SDKs (software development kits) in common applications, such as weather or flashlight apps, and sold to marketing companies and even government agencies.

Exposure of sensitive groups: There have been confirmed cases of data brokers selling lists of people categorized as "pregnant women," "Hispanic churchgoers," or members of the LGBTQ+ community, often exposing individuals in vulnerable situations.

Provided by University of New South Wales 
