Sunday, October 12, 2025


TECH


IBM bets on agentic AI and cloud unification

At its annual developer event, TechXchange 2025, IBM unveiled a suite of new software and infrastructure capabilities designed with a clear goal: taking companies beyond the artificial intelligence experimentation phase. The announcements focus on solving some of the biggest obstacles to large-scale AI adoption, such as the fragmentation of hybrid cloud environments and the complexity of governing autonomous systems.

The company's strategy is based on three fundamental pillars: the orchestration of agentic AI with WatsonX Orchestrate, the unification of infrastructure management with Project Infragraph (the first major synergy from the HashiCorp acquisition), and the acceleration of developer productivity with a new AI-native development environment, Project Bob.

The concept of "agentic AI"—AI systems capable of performing complex tasks proactively and autonomously—was at the heart of the announcements. IBM positions its WatsonX Orchestrate product as the brain of this new era, an agnostic platform capable of orchestrating multiple agents and tools.

The big news is AgentOps, an integrated governance and observability layer. In practice, AgentOps functions as a control tower for AI agents. IBM uses an HR agent as an example: without AgentOps, the IT team has no visibility into how the agent is enforcing internal policies or handling sensitive data; with AgentOps, all actions are monitored in real time, allowing anomalies to be corrected immediately.

To facilitate the creation of these agents, IBM announced two improvements:

- Agentic Workflows: Allow developers to sequence multiple agents and tools through standardized, reusable flows, avoiding the fragility of custom scripts.

- Integration with Langflow: A visual drag-and-drop tool that lets non-technical teams build an AI agent in minutes. The integration is expected to be available at the end of October.
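The idea behind a standardized, reusable agent flow can be sketched in a few lines. This is a hypothetical illustration only, not the WatsonX Orchestrate API: the `Step` wrapper, the toy agents, and the print-based monitoring hook are all invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Step:
    """One stage of a workflow: a named agent that transforms a text payload."""
    name: str
    agent: Callable[[str], str]

def run_workflow(steps: List[Step], request: str) -> str:
    """Run agents in sequence, feeding each output to the next step."""
    result = request
    for step in steps:
        result = step.agent(result)
        # Logging every action is the kind of visibility AgentOps formalizes.
        print(f"[{step.name}] -> {result}")
    return result

# Toy agents standing in for real ones (classification, policy check, reply drafting).
workflow = [
    Step("classify", lambda r: f"category=leave_request; {r}"),
    Step("policy",   lambda r: f"policy=approved; {r}"),
    Step("respond",  lambda r: f"Reply drafted for: {r}"),
]

final = run_workflow(workflow, "Employee asks about parental leave")
```

The point of declaring the flow as data rather than ad hoc glue code is that the same sequence can be versioned, reused, and observed, which is what distinguishes a workflow from the "fragile custom scripts" the announcement refers to.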

Following its recent acquisition of HashiCorp, IBM unveiled Project Infragraph, its answer to the complexity of managing multicloud environments. The project aims to replace the proliferation of monitoring tools with a unified, intelligent control plane.

Currently, when a critical vulnerability is discovered, the remediation process is manual and time-consuming. With Project Infragraph, IBM promises a centralized, real-time view of the entire infrastructure and security posture, both inside and outside the HashiCorp Cloud Platform (HCP). The platform will allow for the instant identification of all components affected by a vulnerability, without the need for manual processes.

Project Infragraph will be available as a feature of HCP, with a private beta program scheduled for December. In the future, IBM plans to expand its connectivity to other solutions in its portfolio, such as Red Hat Ansible, OpenShift, and WatsonX Orchestrate.

The future of development with Project Bob...IBM also unveiled the first preview of an ambitious new tool for developers: Project Bob. Described as a "first-generation AI-powered integrated development environment (IDE)," its goal is to go beyond current code assistants.

Project Bob is designed to be an active partner for developers throughout the development lifecycle, from writing and testing code to large-scale application modernization and security assurance. Its capabilities include:

- Application Modernization: Automating system updates and context-aware code refactoring across massive code bases.

- Intelligent Code Generation: The assistant understands enterprise architecture patterns, security, and compliance requirements.

- Secure Development: Integrating vulnerability scans and remediation directly into the developer's workflow ("shift-left").

To combat the risk of vendor lock-in, IBM has strengthened its commitment to an open AI ecosystem. The main development in this field was the announcement of a new partnership with Anthropic, a leading player in language model development.

IBM will integrate Anthropic's Claude models directly into some of its software products, starting with Project Bob. This collaboration embodies IBM's strategy of offering flexibility to its clients, allowing them to choose the most appropriate AI models for each task, rather than limiting them to their own ecosystem.

Conclusion...The TechXchange 2025 announcements paint a clear picture of IBM's strategy for the era of enterprise AI. Rather than focusing solely on language models, the company is building an integrated platform that spans the entire AI lifecycle, from the infrastructure that supports it (Project Infragraph), to the tools that create it (Project Bob), to the orchestration and governance of the agents that execute it (WatsonX Orchestrate). It's a holistic and pragmatic approach focused on solving the real problems of complexity, security, and fragmentation that companies face when trying to operationalize artificial intelligence.

mundophone

 

DIGITAL LIFE


Why AI won't turn mediocre people into geniuses

Publicist Gal Barradas challenges the widespread belief that artificial intelligence will help mediocre people succeed. "It might help, but mediocre people will continue to be mediocre. Meanwhile, creative people will become more creative."

With 30 years of experience in marketing and communications, Gal Barradas has held management positions at agencies such as W/Brasil, AgênciaClick, F/Nazca, F.biz, and BETC. She has participated in major Brazilian advertising projects, such as the Skol campaign ("the beer that goes down smoothly"), and has a few Cannes Lions under her belt.

In 2018, she founded Gal Barradas Brand & Venture, focused on accelerating startups in different sectors through marketing and technology. She is currently responsible for the Growth strategy at Rio Bravo Investimentos, in addition to leading the Marketing, Sales, and Investor Relations departments.

A scholar of the relationship between marketing and technology, she launched the book "New Questions, Different Answers" in 2018 – a new edition is promised soon. "Throughout my career, I've experienced wonderful moments and participated in major market cases," said Gal in an interview with Época NEGÓCIOS. "Today, marketing is much more complex, but I don't mind. We can create so much more, and there are more ways to reach people. I find it all very entertaining." Check out the main excerpts from the conversation below.

ÉPOCA NEGÓCIOS: What have been the biggest impacts of technology on marketing in recent years?

GAL BARRADAS: I would venture to say that marketing and communication have been the areas most impacted by technology, because people have all become full-time communicators. We all write, photograph, film, publish, comment, and create meaning for brands. We speak well, we speak badly, we share, we recommend. I wrote a book about technology and marketing in 2018, and everything I said there still applies. These are structural issues, related to brand meaning and positioning. But many new things have emerged, which is why I'm preparing volume 2. One of the topics I'll cover, of course, is the impact of artificial intelligence on the segment.

ÉPOCA NEGÓCIOS: How do you assess the magnitude of this impact?

GAL BARRADAS: In the history of marketing, time has brought some turning points. The difference is that these days they happen more quickly. If we stop to think specifically about technology, marketing, and communication, there have been some very visible turning points. When companies began to understand that they could provide services through digital environments, with banks as pioneers. When Facebook became the world's largest social network in 2009, and soon after, it was discovered that it was possible to use the network as a means of selling not only one's image, but also products and services. And when e-commerce sales exploded in 2012.

Now we're experiencing another one of those moments, with the popularization of artificial intelligence tools. It's possible to do art direction, illustrations, and texts using generative AI. But a note of caution is in order. There's a widespread belief that artificial intelligence will help the mediocre succeed. It may even help, but the mediocre will remain mediocre. Meanwhile, creatives will become more creative because they know how to use the tool. When you have highly talented people, they learn to use artificial intelligence like a DJ composing a song. They use the most diverse forms of sound generation and manage to create something unique. So, when you have a well-crafted prompt and art direction for an illustration, you'll create an extraordinary image. Only an artist can do that. Whether they did it with a paintbrush or with artificial intelligence tools doesn't matter. What matters is the end result. This concerns authorship, creativity, and originality, and this will not be lost. It's a result that has references, tone, and style. All of this is human intelligence.

ÉPOCA NEGÓCIOS: When does it make sense to use artificial intelligence?

GAL BARRADAS: Every time it reduces the professional's workload and frees them to create, think, and plan. AI is not capable of developing a strategy. It analyzes what's available in the world and can provide, in a very rich way, data analysis, correlations, and even trends. When I say it's not capable of creating a strategy, it's because we are the ones living in the real world, who have the experience and ability to understand why it produced those results. We are capable of understanding the historical perspective that brought us here. Furthermore, we must also be able to make a historical projection forward, of how things will develop, taking into account global, local, social, and economic aspects. And including those that are happening now and are not available for artificial intelligence analysis.

ÉPOCA NEGÓCIOS: How can we preserve brand credibility when users notice and reject the use of AI in communications? 

GAL BARRADAS: To begin with, brands should mention when the content was generated by AI, because it's still very new; we're in a learning process, and people need to know. Now, if anyone says they know what the impact of artificial intelligence will be on the markets, they're lying. Nobody knows. For that to happen, before we get here, with artificial intelligence at everyone's fingertips, it would be necessary for human beings to have made a cognitive leap, not only in terms of education, but also in ethics, legislation, market regulation—these tools that make a society exist in an organized way. Unfortunately, that didn't happen. And human beings, by nature, use things for both good and evil. The tool was made available to everyone, without this having been discussed beforehand. So people will have to learn by using, suffering, creating.

ÉPOCA NEGÓCIOS: Some say that the use of hyper-realistic avatars, for example, takes away the authenticity and spontaneity of what was created...

GAL BARRADAS: I agree. I was watching campaigns made with AI the other day. Some are very poorly done, full of strange characters. This really drives customers away, who start to find everything very artificial. And the goal of artificial intelligence is not to be artificial. It's to be natural. So, when it's misused and doesn't achieve its true objectives, I criticize it too.

ÉPOCA NEGÓCIOS: What other marketing trends do leaders need to watch out for today?

GAL BARRADAS: It's very important for companies to see themselves as communities, so they can attract people through shared interests. They need to position themselves as groups of people, not as mere broadcasters of sales communications. Communities form around very defined ideas, with a very clear purpose. The brand must be available and open to listening and conversation. It needs to dedicate time to this, and provide some kind of reward for that person. "I heard what you said, so I improved this, I created this other thing." The person needs to feel recognized for being part of the community. This way, you create mechanisms of consideration, ideally, of consumer loyalty to the brand. And it's these mechanisms that lead people to make a transaction with that brand, product, or service. Nike does this very well, and so does Adidas. Awaken this sense of community, listen to what people have to say, co-create. And maintain this conversation constantly, every day. Tell stories, but also practice social listening.

ÉPOCA NEGÓCIOS: Communication today is fragmented across many channels. How should brands deal with this?

GAL BARRADAS: You need to be clear about your goals and how much time you have to achieve them. From there, you develop your strategy. In the past, when we didn't have all this media fragmentation, companies thought: I'll put this much on broadcast TV, this much on out-of-home media, this much on radio, this much in magazines. Today, there are many channel possibilities, so you need to be more precise in distribution. On the other hand, today we have the right tools for this, and the data to achieve this engagement. This way, we achieve deeper segmentation.

ÉPOCA NEGÓCIOS: And this is linked to hyper-personalization, right?

GAL BARRADAS: Today, the term we use most is massive super-segmentation. Because before, mass communication was broadcast. You had a message, put it on television, and spoke to everyone at the same time. Today, digital tools allow you to speak to thousands of people in a segmented way. But to do that, you need access to data. There's media data, which is anonymous. With it, you can speak by clusters, by groups. But the most important thing for brands is what we call first-party data, your customer database, which allows you to talk to them about products and services. Because they are the ones who form your community.

ÉPOCA NEGÓCIOS: How do you see the relationship between brands and influencers evolving?

GAL BARRADAS: Years ago, everything was very new; people didn't really know how it worked. But today, we know it very well, so there's no room for error. I see a lot of people using influencers for everything. I don't agree with that. Influencers are endorsement tools. You want to leverage that person's prestige with consumers to influence decisions toward your brand. If you have a new brand that no one knows about, this strategy won't work. Another important fact: for an influencer to deliver results, they must have some real connection to the company or share the same values. There's no point in hiring someone who has a large audience but has nothing in their content that relates to that brand. You can't buy that person's audience. You have to buy that person's truth. This is something that has been revealed over time, but by now everyone should know.

Reporter: Marisa Adán Gil, Epoca magazine, Brazil

Saturday, October 11, 2025

 

DIGITAL LIFE


The invisible hand of big tech

When Scottish economist Adam Smith, known as the father of liberalism, spoke in the 18th century about "the invisible hand of the market," he predicted that by seeking their own benefit, individuals could benefit society as a whole. But Smith also warned that merchants often collude and manipulate the rules to their advantage, and that the state must prevent monopolies and guarantee public goods. Today, it is clear that, if left unchecked, Big Tech and other companies act as Smith feared: they shape laws, co-opt governments, and manipulate public opinion for their own benefit.

In the face of legislative advances, one sector has been prominent in anti-regulation lobbying efforts worldwide: Big Tech, a select group of billion-dollar companies such as Meta (owner of Facebook, WhatsApp, and Instagram), Alphabet (owner of Google), Amazon, Microsoft, and Apple – known as the "Big Five" – and others such as China's ByteDance (owner of TikTok), Argentina's MercadoLibre, and new players in the artificial intelligence race, such as OpenAI.

Together, these tech giants have a greater impact on every aspect of people's lives than many governments. However, unlike governments, whose goal is to serve the public and are accountable to it, Big Tech aims to maximize profits and be accountable to its shareholders.

There is little data on how they influence legislation. Today, Big Tech is the sector that spends the most on lobbying in the European Union, where companies are required to declare their investments. In 2024, the sector spent €67 million there, a 57% increase since 2020. In the United States—where lobbying spending must also be disclosed—they spent $61 million in the same year, a 13% increase compared to 2023. To gain favor with the Trump administration, companies like Amazon, Meta, Google, and Microsoft each donated $1 million to the inauguration ceremony committee, and their CEOs lined up during the event.

For the first time, a collaborative, cross-border investigation identified nearly 3,000 Big Tech lobbying efforts in parliaments and governments around the world, which can be accessed in this interactive database. We also documented lawsuits and bills involving the rules of the game in the tech industry.

Making Big Tech's "invisible hand" visible is a task the organizations participating in this project consider urgent. Accordingly, they all make a collective disclosure of any funding they receive, or have received in the past, from technology companies.

https://apublica.org/especial/a-mao-invisivel-das-big-techs/

mundophone


TECH



Surfshark launches world's first 100Gbps VPN servers

VPN competition has been fierce in recent years. Major brands vie for consumer attention with promises of increased privacy, more server locations, and, above all, faster speeds. Today, Surfshark has taken a giant leap forward on this last front, announcing the launch of the world's first 100Gbps VPN servers.

This is a monumental technological leap, representing a tenfold increase in capacity compared to the current 10Gbps standard, which most major VPN providers, such as NordVPN and ExpressVPN, have adopted in recent years. With this move, Surfshark isn't just trying to be faster; it's redefining what's technically possible and preparing its network for the future of the internet.

Before you imagine that your download speed will magically skyrocket to 100Gbps, it's crucial to understand what this number means. The advertised speed doesn't refer to each user's individual speed, but rather to the server's total bandwidth capacity.

Think of a VPN server like a highway. The server's bandwidth is the number of lanes on that highway. If too many cars (users) try to use a highway with few lanes at the same time, the result is a traffic jam (congestion), and everyone's speed slows down.

Until now, the industry-standard highway had 10 lanes (10Gbps). What Surfshark has done is open the first 100-lane highway.

The benefit to you, as a user, isn't having a car that runs at 100Gbps, but rather having the guarantee that, even if the highway is full of other cars, there will always be a free lane for you. This translates into a much more stable, consistent, and faster connection, especially during peak hours (at night, for example), when VPN servers are under greater pressure.

Surfshark's justification for this massive investment is the future. Bandwidth-hungry activities like 4K (and soon 8K) video streaming, competitive online gaming, virtual reality, and remote work involving large file transfers are becoming the norm.
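A quick back-of-the-envelope calculation makes the highway analogy concrete. The peak-hour user count below is an illustrative assumption, not a Surfshark figure:

```python
# How much bandwidth is left per user when one server's total capacity
# is split evenly across all active users.

def per_user_mbps(server_gbps: float, concurrent_users: int) -> float:
    """Even split of total server capacity across active users, in Mbps."""
    return server_gbps * 1000 / concurrent_users

users = 2000  # assumed peak-hour load on a single server

old = per_user_mbps(10, users)    # 10 Gbps server (industry standard)
new = per_user_mbps(100, users)   # 100 Gbps server

print(f"10 Gbps server:  {old:.0f} Mbps per user")   # 5 Mbps
print(f"100 Gbps server: {new:.0f} Mbps per user")   # 50 Mbps
```

Real servers don't divide bandwidth this evenly, but the headroom argument holds: with ten times the capacity, the same crowd of users is far less likely to saturate the server.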

To prevent your VPN from becoming a bottleneck preventing you from enjoying these experiences, server capacity must grow to keep up with demand. By making this leap to 100Gbps, Surfshark is "future-proofing," ensuring its network will have the necessary capacity for years to come.

This is where expectations need to be tempered. This isn't an update that will affect your connection tomorrow, unless you live near Amsterdam.

For now, the launch is limited to just a few servers in this Dutch city. Surfshark hasn't given any indication of when it plans to expand this technology to other major locations, such as the United States, Asia, or the rest of Europe.

In practice, this announcement should be seen more as a proof of concept and a powerful marketing ploy than as a feature that most users will be able to take advantage of in the short term. Surfshark is demonstrating its technological capabilities and establishing itself as a leader in innovation.

According to Donatas Budvytis, Chief Technology Officer at Surfshark, the change is driven by several factors, including the growing number of devices per household, which requires higher network capacity to handle large software updates and sustain higher bitrates.

“With 10 times the headroom of 10Gbps, we can reduce congestion and maintain consistent speeds, even during high traffic spikes. This is especially important as the demand for higher network capacity and the number of online devices per household continue to grow rapidly. Also, VPN services should not become a bottleneck and have to be prepared for future technologies like augmented reality glasses or any other virtual reality headsets, which will depend on real-time data streaming and fast connections,” explained Budvytis.

Surfshark's new 100Gbps servers allow VPN technology to be future-proof and ready for the growing demand when the shift to higher-capacity hardware happens.

Increased bandwidth also reduces the need for throttling or deprioritizing traffic, allowing users to get closer to their maximum internet speeds more often, even when backing up heavy documents to the cloud or downloading a game.

“100Gbps hardware enables faster encryption on modern CPUs, more intelligent software paths, and improved load distribution. This results in consistently high speeds, greater stability, and the necessary capacity to handle future bandwidth-intensive applications,” said Budvytis.

For this solution, Surfshark has chosen the Amsterdam location due to its impressive internet exchange (AMS-IX), which handles over 14 trillion bits per second, making it one of the world's largest internet exchanges by traffic volume. To put this into perspective, that’s roughly 1.75 terabytes of data every second, ~560,000 simultaneous 4K streams, equivalent to about 7.5 million people watching TikTok videos simultaneously, or around 63 million people playing Fortnite at once.
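The article's AMS-IX numbers can be checked with simple unit conversions. The 25 Mbps per-stream figure is a common assumption for 4K streaming, not a number stated in the article:

```python
# Converting AMS-IX's ~14 trillion bits per second into the article's figures.

total_bps = 14e12            # ~14 trillion bits per second

bytes_per_s = total_bps / 8  # 8 bits per byte
print(f"{bytes_per_s / 1e12:.2f} TB of data every second")   # 1.75 TB/s

four_k_bitrate = 25e6        # assumed ~25 Mbps per 4K stream
streams = total_bps / four_k_bitrate
print(f"{streams:,.0f} simultaneous 4K streams")             # 560,000
```

Both headline figures fall out directly: 14 Tbit/s is 1.75 TB/s, and dividing the total by a 25 Mbps stream gives the quoted 560,000 simultaneous 4K streams.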

The pressure on the competition...Still, the impact of this news on the industry will be immediate. Providers like NordVPN, ExpressVPN, and Proton VPN, which have invested millions in recent years to upgrade their networks from 1Gbps to 10Gbps, now see Surfshark leapfrogging them by an entire generation.

The pressure to stay ahead is now immense. Although the reach is limited, the title of "first and only VPN with 100Gbps servers" is a formidable marketing tool, which will certainly force competitors to accelerate their own development plans. The race to 100Gbps has officially begun, and Surfshark has fired the starting gun.

mundophone

Friday, October 10, 2025

 

DIGITAL LIFE


Science's warning about the impact of letting artificial intelligence think for us

Researchers warn: the more we delegate to artificial intelligence, the more we risk losing something essentially human—the ability to reflect and understand the world autonomously. Science is beginning to reveal the consequences of a future where machines make decisions for us.

We live in an era where technology not only makes life easier, but also shapes our way of thinking. Artificial intelligence, present in schools, businesses, and even in everyday conversations, raises a troubling question: what happens when we stop thinking for ourselves and start relying on algorithms to interpret reality?

During a meeting of the Organization of Ibero-American States (OEI), neuroscientist Florencia Labombarda, a researcher at CONICET, issued a stark warning: blind trust in artificial intelligence may be altering the way we reason.

According to her, humans have a natural tendency to delegate decisions to authority figures—and today, algorithms have taken over that role. "We've turned AI into a source of authority. This is a bias we ourselves have created," Labombarda stated during the CONEXOS series. This transfer of cognitive power may seem harmless, but it has profound implications for intellectual autonomy.

The silent impact from childhood...One of the most sensitive points highlighted by the researcher concerns new generations. The excessive use of AI-based tools, especially among children and adolescents, can harm the development of critical thinking. "The brain loses training when we let machines think for us," she warned.

The key concept, according to Labombarda, is metacognition—the ability to understand how and why we think. Teaching young people to use AI as a support, not a substitute for reflection, is essential to prevent the atrophy of their own reasoning. Without this balance, we run the risk of creating generations that rely more on automatic responses than on personal analysis.

The dangers, however, extend beyond the cognitive realm. Labombarda also drew attention to the emotional effects of interacting with machines that simulate empathy. Many AI tools are programmed to please, but they don't feel or understand. "It's crucial that children understand that there's not a person on the other end, but an algorithm," she emphasized.

This illusion of companionship can create an emotional deficit: believing that there's emotional reciprocity in a conversation that, in reality, is mediated by codes and calculations. Over time, this can weaken social skills and the perception of genuine human connection.

Recovering the human in the digital world...Awarded the title "Scientists Who Count 2023," Labombarda reinforces the need to revalue real connections in an era dominated by screens and artificial intelligence. "We've already lost the first game with social media, but we can still win the second with AI," she stated. The final message is simple but urgent: artificial intelligence can be a powerful ally, as long as it doesn't replace what makes us human—our ability to think, question, and feel.

Coexistence, not replacement...The scenario in which AI "thinks for us" is not ideal. The most promising and ethical future involves a collaboration between humans and AI, where artificial intelligence serves as a tool to augment our capabilities, rather than replace them. The key is to use AI to automate routine tasks, while humans retain control over critical thinking, creativity, ethics, and empathy. The question is not whether AI will think for us, but rather how we can use it responsibly to advance human development, without losing the qualities that make us unique.

mundophone


TECH


Japanese discovery could eliminate the biggest obstacle to quantum computing

Researchers at Osaka University have developed a groundbreaking technique that promises to solve one of the biggest challenges in quantum computing: quantum noise. The breakthrough, based on a new way of preparing "magic states," could accelerate the arrival of truly functional and powerful quantum computers.

Quantum computing promises a monumental leap in processing power—but it still faces a persistent enemy: quantum noise. Quantum systems are so sensitive to this interference that even a slight temperature variation or a stray photon can destabilize an entire quantum computer.

"Even the smallest disturbance can render the system useless," explained Tomohiro Itogawa, a researcher at Osaka University, in an interview with Science Daily. "Noise is, without a doubt, the number one enemy of quantum computers."

It was precisely in trying to overcome this obstacle that the Japanese team arrived at an innovation that could redefine the future of the sector. To understand this breakthrough, one must understand the concept of “magic states”—the fundamental structures used by quantum computers to perform complex calculations with high precision.

These states serve as the building blocks of quantum logic. The problem is that preparing them with sufficient fidelity has always been extremely difficult and costly, requiring thousands of qubits (the basic units of quantum information).

This is where the Osaka team innovated. Instead of using traditional methods, they created a technique called “zero-level magic state distillation”—a process that acts directly at the physical level of the qubits, drastically reducing the need for intermediate correction layers.

“We wanted to speed up the preparation of high-fidelity states,” explained Keisuke Fujii, co-author of the study. “Our approach works much more directly and efficiently.”

A quantum leap in efficiency...The impact of the method was striking. Simulations showed reductions of dozens of times in time and space overhead compared to conventional techniques, as well as a significant increase in accuracy. In practice, this means it will be possible to build larger, more stable, and much faster quantum computers. By optimizing the generation of "magic states" at the most basic hardware level, the process eliminates bottlenecks that previously impeded the scalability of these machines.

This advance isn't just technical—it marks the transition from theory to actual quantum engineering, where the priority is to build fault-tolerant systems capable of functioning even with environmental interference.

Enemy number one: quantum noise...Quantum noise is one of the biggest problems in the field because qubits are extremely fragile. Any vibration, magnetic field variation, or external particle can alter their states and corrupt calculations.

The new technique developed in Osaka makes these qubits more resistant to noise, bringing scientists closer to their ultimate goal: creating fault-tolerant quantum computers capable of operating reliably for long periods.

It's the kind of robustness needed to make quantum computing practical—not just experimental. 

What changes with this breakthrough...The implications are enormous. By overcoming noise and reducing system complexity, the technology could accelerate the arrival of commercial quantum computers—machines capable of solving calculations millions of times faster than current computers.

This power promises to revolutionize entire sectors.

Finance: portfolio optimization, economic forecasts, and risk simulations in seconds.

Biotechnology: discovery of new drugs and simulation of molecules at scales currently impossible.

Climate and energy: accurate prediction of complex phenomena and optimization of power grids.

Furthermore, the study reinforces the race for quantum supremacy, with Japan, the United States, China, and Europe investing heavily to dominate the field that could redefine global technological power.

The Osaka University discovery not only takes quantum theory a step further—it brings the world closer to the reality of functional quantum supercomputers. If "noise" was enemy number one, perhaps the Japanese have finally found the way to silence it.

mundophone

Thursday, October 9, 2025


TECH


Geothermal energy has huge potential to generate clean power, including from used oil and gas wells

As energy use rises and the planet warms, you might have dreamed of an energy source that works 24/7, rain or shine, quietly powering homes, industries and even entire cities without the ups and downs of solar or wind—and with little contribution to climate change.

The promise of new engineering techniques for geothermal energy—heat from Earth itself—has attracted rising levels of investment to this reliable, low-emission power source that can provide continuous electricity almost anywhere on the planet. That includes ways to harness geothermal energy from idle or abandoned oil and gas wells. In the first quarter of 2025, North American geothermal installations attracted US$1.7 billion in public funding—compared with $2 billion for all of 2024, which itself was a significant increase from previous years, according to an industry analysis from consulting firm Wood Mackenzie.

As an exploration geophysicist and energy engineer, I've studied geothermal systems' resource potential and operational trade-offs firsthand. From the investment and technological advances I'm seeing, I believe geothermal energy is poised to become a significant contributor to the energy mix in the U.S. and around the world, especially when integrated with other renewable sources.

A May 2025 assessment by the U.S. Geological Survey found that geothermal sources just in the Great Basin, a region that encompasses Nevada and parts of neighboring states, have the potential to meet as much as 10% of the electricity demand of the whole nation—and even more as technology to harness geothermal energy advances. And the International Energy Agency estimates that by 2050, geothermal energy could provide as much as 15% of the world's electricity needs.

Why geothermal energy is unique...Geothermal energy taps into heat beneath Earth's surface to generate electricity or provide direct heating. Unlike solar or wind, it never stops. It runs around the clock, providing consistent, reliable power with closed-loop water systems and low emissions.

Geothermal is capable of providing significant quantities of energy. For instance, Fervo Energy's Cape Station project in Utah is reportedly on track to deliver 100 megawatts of baseload, carbon-free geothermal power by 2026. That's less than the amount of power generated by the average coal plant in the U.S., but more than the average natural gas plant produces.

But the project, estimated to cost $1.1 billion, is not complete. When finished in 2028, the station is projected to deliver 500 megawatts of electricity. That is 100 megawatts more than its original goal, achieved without additional drilling thanks to various technical improvements made since the project broke ground.

And geothermal energy is becoming economically competitive. By 2035, according to the International Energy Agency, technical advances could mean energy from enhanced geothermal systems could cost as little as $50 per megawatt-hour, a price competitive with other renewable sources.

Types of geothermal energy...There are several ways to get energy from deep within Earth. 

Hydrothermal systems tap into underground hot water and steam to generate electricity. These resources are concentrated in geologically active areas where heat, water and permeable rock naturally coincide. In the U.S., that's generally California, Nevada and Utah. Internationally, hydrothermal resources are concentrated in countries such as Iceland and the Philippines.

Some hydrothermal facilities, such as Larderello in Italy, have operated for over a century, proving the technology's long-term viability. Others in New Zealand and the U.S. have been running since the late 1950s and early 1960s.

Enhanced geothermal systems effectively create electricity-generating hydrothermal processes just about anywhere on the planet. In places where there is not enough water in the ground or where the rock is too dense to move heat naturally, these installations drill deep holes and inject fluid into the hot rocks, creating new fractures and opening existing ones, much like hydraulic fracturing for oil and gas production.

A system like this uses more than one well. In one, it pumps cold water down, which collects heat from the rocks and then is pumped back up through another well, where the heat drives turbines. In recent years, academic and corporate research has dramatically improved drilling speed and lowered costs.
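The heat such a two-well loop can deliver follows from basic calorimetry. The sketch below is a back-of-the-envelope estimate, not data from any real project: the flow rate, downhole temperatures and conversion efficiency are all illustrative assumptions.

```python
# Rough thermal and electric output of a two-well geothermal loop.
# All numbers here are illustrative assumptions, not project data.

def thermal_power_mw(flow_kg_s: float, t_out_c: float, t_in_c: float) -> float:
    """Heat carried up by the circulating water, in megawatts (thermal)."""
    c_p = 4186.0  # specific heat of water, J/(kg*K)
    return flow_kg_s * c_p * (t_out_c - t_in_c) / 1e6

def electric_power_mw(thermal_mw: float, efficiency: float = 0.12) -> float:
    """Electric output, assuming a modest heat-to-electricity efficiency."""
    return thermal_mw * efficiency

# Example: 80 kg/s of water heated from 40 C to 180 C in the hot rock.
heat = thermal_power_mw(80, 180, 40)   # ~46.9 MW thermal
power = electric_power_mw(heat)        # ~5.6 MW electric
print(f"{heat:.1f} MW thermal -> {power:.1f} MW electric")
```

The calculation makes the engineering trade-off concrete: output scales linearly with both flow rate and the temperature difference between the injection and production wells, which is why drilling deeper (hotter rock) and improving fracture connectivity (higher flow) are the two levers research has focused on.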

Ground source heat pumps do not require drilling holes as deep, but instead take advantage of the fact that Earth's temperature is relatively stable just below the surface, even just 6 or 8 feet down (1.8 to 2.4 meters)—and it's hotter hundreds of feet lower.

These systems don't generate electricity but rather circulate fluid in underground pipes, exchanging heat with the soil, extracting warmth from the ground in winter and transferring warmth to the ground in summer. These systems are similar to, but more efficient than, air-source heat pumps, sometimes called minisplits, which are becoming widespread across the U.S. for heating and cooling. Geothermal heat pump systems can serve individual homes, commercial buildings and even neighborhood or business developments.
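The efficiency edge of ground-source over air-source systems comes from the smaller temperature gap the pump has to work across. A quick way to see this is the ideal (Carnot) coefficient of performance, which bounds how much heat any pump can move per unit of electricity. The temperatures below are illustrative assumptions; real heat pumps achieve only a fraction of the Carnot bound, but the relative advantage holds.

```python
# Why stable ground temperature helps a heat pump: the ideal (Carnot)
# heating COP depends on the gap between source and indoor temperature.
# Temperatures are illustrative, not measured values.

def carnot_cop_heating(t_indoor_c: float, t_source_c: float) -> float:
    """Theoretical upper bound on heating COP for the given temperatures."""
    t_hot = t_indoor_c + 273.15   # indoor temperature in kelvin
    t_cold = t_source_c + 273.15  # heat-source temperature in kelvin
    return t_hot / (t_hot - t_cold)

# Heating a 21 C home from 10 C ground versus -5 C winter air:
ground = carnot_cop_heating(21, 10)   # ~26.7
air = carnot_cop_heating(21, -5)      # ~11.3
print(f"ground-source bound: {ground:.1f}, air-source bound: {air:.1f}")
```

The ground-source bound is more than twice the air-source bound in this example, and, just as important, it stays roughly constant year-round because the soil temperature barely changes.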

Direct-use applications also don't generate electricity but rather use the geothermal heat directly. Farmers heat greenhouses and dry crops; aquaculture facilities maintain optimal water temperatures; industrial operations use the heat to dehydrate food, cure concrete and run other energy-intensive processes. Worldwide, these applications now deliver over 100,000 megawatts of thermal capacity. Some geothermal fluids contain valuable minerals; lithium concentrations in the groundwater of California's Salton Sea region could potentially supply battery manufacturers. Federal judges are reviewing a proposal to do just that, as well as legal challenges to it.

Researchers are finding new ways to use geothermal resources, too. Some are using underground rock formations to store energy as heat when consumer demand is low and to produce electricity from it when demand rises.

Some geothermal power stations can adjust their output to meet demand, rather than running continuously at maximum capacity.

Geothermal sources are also making other renewable-energy projects more effective. Pairing geothermal energy with solar and wind resources and battery storage is increasing the reliability of above-ground renewable power in Texas, among other places.

And geothermal energy can power clean hydrogen production as well as energy-intensive efforts to physically remove carbon dioxide from the atmosphere, as is happening in Iceland.

Geothermal potential in the US and worldwide...Currently, the U.S. has about 3.9 gigawatts of installed geothermal capacity, mostly in the West. That's about 0.4% of current U.S. energy production, but the amount of available energy is much larger, according to federal and international engineering assessments. And converting abandoned oil and gas wells for enhanced geothermal systems could significantly increase the amount of energy available and its geographic spread.
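Comparing geothermal's small installed capacity with its outsized share of generation comes down to converting nameplate capacity into annual output via the capacity factor. The sketch below shows that conversion; the capacity factors are assumed illustrative values, not official statistics.

```python
# Converting installed capacity (GW) to annual generation (TWh).
# Geothermal's high capacity factor is why a few gigawatts of it
# generate as much as far more nameplate solar. Capacity factors
# below are assumed illustrative values.

HOURS_PER_YEAR = 8760

def annual_generation_twh(capacity_gw: float, capacity_factor: float) -> float:
    """Annual electricity output in terawatt-hours."""
    return capacity_gw * capacity_factor * HOURS_PER_YEAR / 1000

# 3.9 GW of geothermal at an assumed 75% capacity factor:
geo = annual_generation_twh(3.9, 0.75)    # ~25.6 TWh per year
# The same nameplate capacity of solar at an assumed 25% factor:
solar = annual_generation_twh(3.9, 0.25)  # ~8.5 TWh per year
print(f"geothermal: {geo:.1f} TWh, solar: {solar:.1f} TWh")
```

Under these assumptions the same 3.9 gigawatts yields roughly three times more electricity as geothermal than as solar, which is the arithmetic behind calling geothermal a baseload resource.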

One example is happening in Beaver County, in southwestern Utah. Once a struggling rural community, it now hosts multiple geothermal plants that are being developed both to demonstrate the technology's potential and to supply electricity to customers as far away as California.

Those projects include repurposing idle oil or gas wells, which is relatively straightforward: Engineers identify wells that reach deep, hot rock formations and circulate water or another fluid in a closed loop to capture heat to generate electricity or provide direct heating. This method does not require drilling new wells, which significantly reduces setup costs and environmental disruption and accelerates deployment.

There are as many as 4 million abandoned oil and gas wells across the U.S., some of which could shift from being fossil fuel infrastructure into opportunities for clean energy.

Challenges and trade-offs...Geothermal energy is not without technical, environmental and economic hurdles. Drilling is expensive, and conventional systems need specific geological conditions. Enhanced systems, which use hydraulic fracturing, risk inducing earthquakes. Overall emissions from geothermal systems are low, though the systems can release hydrogen sulfide, a corrosive gas that is toxic to humans and can contribute to respiratory irritation. But modern geothermal plants use abatement systems that can capture up to 99.9% of hydrogen sulfide before it enters the atmosphere.

And the systems do use water, though closed-loop systems can minimize consumption.

Building geothermal power stations does require significant investment, but their ability to deliver energy over the long term can offset many of these costs. Projects like those undertaken by Fervo Energy show that government subsidies are no longer necessary for a project to be funded, built and brought online.

Despite its challenges, geothermal energy's reliability, low emissions and scalability make it a vital complement to solar and wind—and a cornerstone of a stable, low-carbon energy future.

Provided by The Conversation
