Wednesday, December 31, 2025

 

DIGITAL LIFE


How to Learn to Spot Flaws in AI-Created Faces

Human faces have always been one of the most powerful signs of trustworthiness. But in the age of artificial intelligence, this intuition is being put to the test. Fake profiles, digital scams, and invented identities use increasingly realistic images, difficult to distinguish from real ones. Now, British research suggests something surprising: you don't need to be an expert or spend hours training. Just a few well-directed minutes are enough to spot what previously went unnoticed.

In recent years, AI image generation tools have evolved rapidly. Software capable of creating hyper-realistic faces is available with just a few clicks, allowing anyone to produce convincing images, even without technical knowledge. This has amplified a silent problem: the growing difficulty in differentiating real faces from artificial ones.

Researchers from four universities in the United Kingdom decided to investigate the extent to which ordinary people can make this distinction—and, above all, whether this ability can be improved quickly. To do this, they gathered more than 600 volunteers and tested their ability to distinguish real human faces from images generated by one of the most advanced systems available at the time.

The initial results revealed a clear limitation. Even individuals with good natural facial recognition skills got it right less than half the time. Participants with abilities considered typical had even lower rates, showing how AI can deceive the human eye with relative ease.

The unexpected impact of just five minutes of guidance...The most revealing part of the study came later. The volunteers underwent a short training session, lasting approximately five minutes, focused on teaching where artificial intelligence still tends to "go wrong." Nothing complex or technical: just visual guidance and practical examples.

After this brief training, the results changed significantly. People with advanced skills began to correctly identify most of the artificial faces. The average participants also showed a significant leap in accuracy, drastically reducing the number of errors.

The secret was not in learning algorithms or understanding how AI works, but in changing where people look. The training taught people to pay attention to specific details that machines still have difficulty reproducing perfectly — small inconsistencies that, once noticed, become difficult to ignore.

The details that reveal an artificial face...Among the main signs pointed out by the researchers are slightly misaligned teeth, unnatural hairlines, ears with strange shapes, or accessories that don't make anatomical sense. In many cases, the face seems "too perfect" as a whole, but fails in isolated details.

These errors often go unnoticed at a quick glance, especially on social media, where images are consumed in seconds. The training showed that slowing down observation and knowing exactly where to look makes all the difference.
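To make "knowing where to look" concrete, the cues above can be written down as a simple checklist. The sketch below is an illustrative Python structure for a manual review, not anything from the study; the cue list paraphrases the signs the researchers mention, and the two-cue threshold is an arbitrary assumption.

```python
# Illustrative manual-review checklist based on the cues in this article.
# The two-cue threshold is an arbitrary assumption, not from the study.
CUES = [
    "teeth slightly misaligned or irregular",
    "hairline looks unnatural or blends oddly into the skin",
    "ears with strange or asymmetric shapes",
    "accessories (glasses, earrings) that make no anatomical sense",
    "face 'too perfect' overall but inconsistent in isolated details",
]

def review(flags: list[bool], threshold: int = 2) -> str:
    """flags[i] is True if the reviewer spotted CUES[i] in the image."""
    return "likely AI-generated" if sum(flags) >= threshold else "no strong signs"

# Example: a reviewer notices odd teeth and a strange hairline.
print(review([True, True, False, False, False]))  # likely AI-generated
```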

According to the study's authors, this type of guidance is becoming increasingly urgent. Computer-generated faces are already used to create fake profiles, deceive identity verification systems, and lend credibility to online scams. The more realistic these images become, the greater the risk for ordinary users.

Digital security begins with the gaze...The researchers emphasize that this is not a problem restricted to technology experts. On the contrary: anyone who uses social media, messaging apps, or online services is potentially exposed. Therefore, simple and quick training methods can have a direct impact on everyday digital security.

The study also suggests that giving this type of guidance to people who already possess high natural facial recognition skills can be especially effective in critical contexts, such as investigations, identity verification, and combating fraud.

As artificial intelligence continues to advance, the race is no longer just technological but also cognitive. Learning to distrust what seems too real can become an essential skill. And, as the research shows, sometimes all the brain needs is five minutes to start seeing the digital world with different eyes.

mundophone


TECH


Tiny tech, big AI power: What are 2-nanometer chips?

Taiwan's world-leading microchip manufacturer TSMC says it has started mass producing next-generation "2-nanometer" chips.

TSMC is the world's largest contract maker of chips, used in everything from smartphones to missiles, and counts Nvidia and Apple among its clients.

"TSMC's 2nm (N2) technology has started volume production in 4Q25 as planned," TSMC said in an undated statement on its website.

The chips will be the "most advanced technology in the semiconductor industry in terms of both density and energy efficiency," the company said.

"N2 technology, with leading nanosheet transistor structure, will deliver full-node performance and power benefits to address the increasing need for energy-efficient computing."

The chips will be produced at TSMC's "Fab 20" facility in Hsinchu, in northern Taiwan, and "Fab 22" in the southern port city of Kaohsiung.

More than half of the world's semiconductors, and nearly all of the most advanced ones used to power artificial intelligence technology, are made in Taiwan.

TSMC has been a massive beneficiary of the frenzy in AI investment. Nvidia and Apple are among firms pouring many billions of dollars into chips, servers and data centers.

AI-related spending is soaring worldwide and is expected to reach approximately $1.5 trillion in 2025, according to US research firm Gartner, and over $2 trillion in 2026—nearly two percent of global GDP.
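That GDP comparison is easy to sanity-check. Assuming global GDP of roughly $110 trillion (an assumption for illustration; the article gives no figure), $2 trillion lands just under two percent:

```python
# Sanity check on the "nearly two percent of global GDP" claim.
# The ~$110 trillion global GDP estimate is an assumption, not from Gartner.
ai_spend_2026_tn = 2.0     # projected AI-related spending, trillions USD
global_gdp_tn = 110.0      # assumed global GDP, trillions USD

print(f"{ai_spend_2026_tn / global_gdp_tn:.1%} of global GDP")  # ~1.8%
```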

Taiwan's dominance of the chip industry has long been seen as a "silicon shield" protecting it from an invasion or blockade by China—which claims the island is part of its sovereign territory—and an incentive for the United States to defend it.

But the threat of a Chinese attack has fueled concerns about potential disruptions to global supply chains and has increased pressure for more chip production beyond Taiwan's shores.

Chinese fighter jets and warships encircled Taiwan during live-fire drills this week aimed at simulating a blockade of the democratic island's key ports and assaults on maritime targets.

Taipei, which slammed the two-day war games as "highly provocative and reckless," said the maneuver failed to impose a blockade on the island.

TSMC has invested in chip fabrication facilities in the United States, Japan and Germany to meet soaring demand for semiconductors, which have become the lifeblood of the global economy.

But in an interview with AFP this month, Taiwanese Deputy Foreign Minister Francois Chih-chung Wu said the island planned to keep making the "most advanced" chips on home soil and remain "indispensable" to the global semiconductor industry.

AFP looks at what that means, and why it's important:

What can they do? The computing power of chips has increased dramatically over the decades as makers cram them with more microscopic electronic components. That has brought huge technological leaps to everything from smartphones to cars, as well as the advent of artificial intelligence tools like ChatGPT.

Advanced 2-nanometer (2nm) chips perform better and are more energy-efficient than past types, and are structured differently to house even more of the key components known as transistors.

The new chip technology will help speed up laptops, reduce data centers' carbon footprint and allow self-driving cars to spot objects quicker, according to US computing giant IBM.

For artificial intelligence, "this benefits both consumer devices—enabling faster, more capable on-device AI—and data center AI chips, which can run large models more efficiently", said Jan Frederik Slijkerman, senior sector strategist at Dutch bank ING.

Who makes them? Producing 2nm chips, the most cutting-edge in the industry, is "extremely hard and expensive", requiring "advanced lithography machines, deep knowledge of the production process, and huge investments", Slijkerman told AFP. Only a few companies are able to do it: TSMC, which dominates the chip manufacturing industry, as well as South Korea's Samsung and US firm Intel.

TSMC is in the lead, with the other two "still in the stage of improving yield" and lacking large-scale customers, said TrendForce analyst Joanne Chiao.

Japanese chipmaker Rapidus is also building a plant in northern Japan to make 2nm chips, with mass production slated for 2027.

What's the political impact? TSMC's path to mass 2nm production has not always been smooth. Taiwanese prosecutors charged three people in August with stealing trade secrets related to 2nm chips to help Tokyo Electron, a Japanese company that makes equipment for TSMC.

"This case involves critical national core technologies vital to Taiwan's industrial lifeline," the high prosecutors' office said at the time. Geopolitical factors and trade wars are also at play.

Nikkei Asia reported this summer that TSMC will not use Chinese chipmaking equipment in its 2nm production lines, to avoid disruption from potential US restrictions.

TSMC says it plans to speed up production of 2nm chips in the United States, currently targeted for "the end of the decade".

How small is two nanometers? Extremely tiny—for reference, an atom is approximately 0.1 nanometers across. But in fact 2nm does not refer to the actual size of the chip itself, or any chip components, and is just a marketing term. Instead "the smaller the number, the higher the density" of these components, Chiao told AFP.

IBM says 2nm designs can fit up to 50 billion transistors, tiny components smaller than a virus, on a chip the size of a fingernail.

To create the transistors, slices of silicon are etched, treated and combined with thin films of other materials.

A higher density of transistors results in a smaller chip, or a chip of the same size with faster processing.
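To get a feel for what that density means, a rough calculation helps. The sketch below divides IBM's 50 billion transistors by a fingernail-sized area of about 150 square millimeters; the area is an assumption for illustration, not a figure from the article.

```python
# Rough transistor-density arithmetic for IBM's 2nm claim.
# The ~150 mm^2 "fingernail-sized" area is an assumed value.
transistors = 50e9        # IBM's stated transistor count
area_mm2 = 150.0          # assumed chip area in square millimeters

density = transistors / area_mm2
print(f"~{density / 1e6:.0f} million transistors per mm^2")  # ~333 million
```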

Can chips get even better? Yes, and TSMC is already developing "1.4-nanometer" technology, reportedly to go into mass production around 2028, with Samsung and Intel not far behind.

TSMC started high-volume 3nm production in 2023, and Taiwanese media says the company is already building a 1.4nm chip factory in the city of Taichung.

As for 2nm chips, Japan's Rapidus says they are "ideal for AI servers" and will "become the cornerstone of the next-generation digital infrastructure", despite the huge technical challenges and costs involved.

© 2025 AFP

Tuesday, December 30, 2025


TECH


China suffers a setback from the US and Europe as control of rare earths begins to shift

For decades, the global rare earth supply chain seemed like immutable territory. A single hub concentrated production, processing, and technology, while the rest of the world accepted dependence as an inevitable cost of progress. This scenario has begun to change silently. Political pressures, industrial risks, and recent strategic decisions are accelerating a reconfiguration that few imagined possible—and that could alter the future of technology, energy, and heavy industry.

The invisible link that sustains modern technology...Few people see it, but almost everything depends on them. Rare earth magnets are at the heart of electric motors, wind turbines, electric vehicles, smartphones, drones, defense systems, and medical equipment. They are small but irreplaceable components for energy efficiency and technological miniaturization.

The problem has never been just the extraction of these elements, but the control of the subsequent steps: refining, chemical separation, and manufacturing of high-performance magnets. Over the years, this know-how became concentrated in a single country, creating a structural dependency that went unnoticed while supply chains functioned without shocks.

This balance began to crumble when trade tensions, export restrictions, and diplomatic disputes transformed a technical issue into a strategic one. Suddenly, governments and companies realized that the energy and digital transition depended on an extremely fragile bottleneck.

The reaction didn't happen overnight. Plans had been shelved for years, but 2025 acted as a catalyst. The combination of successive warnings and geopolitical instability accelerated decisions that previously seemed expensive, slow, or politically sensitive. The focus shifted from short-term efficiency to long-term industrial security.

In this new context, the United States, the European Union, and strategic partners like Australia began to coordinate industrial policies: direct subsidies, regulatory support, public funding, and incentives to reindustrialize critical parts of the rare earth supply chain.

The most visible change didn't happen in speeches, but on the factory floor. In recent months, industrial projects have begun to move from the planning stage to reality, especially in Europe, where external dependence has become a central political issue. A new rare earth magnet production plant in the north of the continent has come to symbolize this strategic shift.

The project does not promise to replace the former global leader, at least not in the short term. The objective is different: to create redundancy, reduce vulnerabilities, and gain room for maneuver. Instead of breaking with the existing system, the bet is to dilute risk, creating alternative poles capable of supporting critical sectors in times of crisis.

Companies specializing in advanced materials have unexpectedly taken center stage on this chessboard. For years, they operated behind the scenes of the industry. Now, they have become key players in a broader geopolitical strategy. Their executives recognize that the demand does not come from a single sector, but from virtually any technology that needs to convert energy efficiently.

This cross-cutting nature changes everything. It's not just about electric cars or renewable energy, but about the basic infrastructure of the modern economy. Therefore, governments have begun to treat rare earth magnets in the same way as semiconductors: as strategic assets, not as mere commodities.

Even so, no one is talking about total independence. The dominance accumulated over decades does not disappear quickly. Complex supply chains require time, scale, and highly specialized human capital. What is at stake is a new equilibrium—less concentrated, more resilient, and politically predictable.

This movement is already beginning to influence investment decisions, trade agreements, and industrial strategies. For the first time in a long time, the almost absolute control of this market faces concrete, albeit partial, alternatives. And this, in itself, is already a game-changer.

The scenario that is emerging is not one of abrupt replacement, but of a gradual redistribution of power. A silent, technical, and slow process—exactly the type of change that usually goes unnoticed until its effects become impossible to ignore.

mundophone


TECH


Big tech blocks California data center rules, leaving only a study requirement

Tools that power artificial intelligence devour energy. But attempts to shield regular Californians from footing the bill in 2025 ended with a law requiring regulators to write a report about the issue by 2027.

If that sounds pretty watered down, it is. Efforts to regulate the energy usage of data centers — the beating heart of AI — ran headlong into Big Tech, business groups and the governor. 

That’s not surprising given that California is increasingly dependent on big tech for state revenue: A handful of companies pay upwards of $5 billion just on income tax withholding.

The law mandating the report is the lone survivor of this year’s push to rein in the data-center industry. Its deadline means the findings likely won’t be ready in time for lawmakers to use in 2026. The measure began as a plan to give data centers their own electricity rate, shielding households and small businesses from higher bills.

It amounts to a “toothless” measure, directing the utility regulator to study an issue it already has the authority to investigate, said Matthew Freedman, a staff attorney with The Utility Reform Network, a ratepayer advocate.

Data centers’ enormous electricity demand has pushed them to the center of California’s energy debate, and that’s why lawmakers and consumer advocates say new regulations matter.

For instance, the sheer amount of energy requested by data centers in California is prompting questions about costly grid upgrades even as speculative projects and fast-shifting AI loads make long-term planning uncertain. Developers have requested 18.7 gigawatts of service capacity for data centers, more than enough to serve every household in the state, according to the California Energy Commission.
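A back-of-the-envelope check shows why 18.7 gigawatts is described that way. The sketch below assumes roughly 13.5 million California households and an average draw of about 1.2 kilowatts each; both numbers are illustrative assumptions, not Energy Commission figures.

```python
# Back-of-the-envelope comparison of data-center requests vs household demand.
# Household count and average draw are assumptions for illustration.
requested_gw = 18.7          # capacity requested by data-center developers
households = 13.5e6          # assumed number of California households
avg_draw_kw = 1.2            # assumed average household demand, kW

household_gw = households * avg_draw_kw / 1e6
print(f"All households: ~{household_gw:.1f} GW vs requests: {requested_gw} GW")
# ~16.2 GW vs 18.7 GW, consistent with "more than enough"
```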

But the report could help shape future debates as lawmakers revisit tougher rules and the CPUC considers new policies on what data centers pay for power – a discussion gaining urgency as scrutiny of their rising electricity costs grows, he said.

“It could be that the report helps the Legislature to understand the magnitude of the problem and potential solutions,”  Freedman said. “It could also inform the CPUC’s own review of the reasonableness of rates for data center customers, which they are likely to investigate.”

State Sen. Steve Padilla, a Democrat from Chula Vista, says that the final version of his law “was not the one we would have preferred,” agreeing that it may seem “obvious” the CPUC can study data center cost impacts.  The measure could help frame future debates and at least “says unequivocally that the CPUC has the authority to study these impacts” as demand from data centers accelerates, Padilla added.

“(Data centers) consume huge amounts of energy, huge amounts of resources, and at least in the near future, we’re not going to see that change,” he said.

Earlier drafts of Padilla’s measure went further, requiring data centers to install large batteries to support the grid during peak demand and pushing utilities to supply them with 100% carbon-free electricity by 2030 — years ahead of the state’s own mandate. Those provisions were ultimately stripped out.

How California’s first push to regulate data centers slipped away...California’s bid to bring more oversight to data centers unraveled earlier this year under industry pressure, ending with Gov. Gavin Newsom’s veto of a bill requiring operators to report their water use. Concerns over the bills reflected fears that data-center developers could shift projects to other states and take valuable jobs with them.

A September Stanford report on powering California data centers said the state risks losing property-tax revenue, union construction jobs and “valuable AI talent” if data-center construction moves out of state.

The idea that increased regulation could lead to businesses or dollars in some form leaving California is an argument that has been brought up across industries for decades. It often does not hold up to more careful or long-term scrutiny. 

In the face of this opposition, two key proposals stalled in the Legislature’s procedural churn. Early in the session, Padilla put a separate clean-power incentives proposal for data centers on hold until 2026. Later in the year, an Assembly bill requiring data centers to disclose their electricity use was placed in the Senate’s suspense file – where appropriations committees often quietly halt measures.

Newsom, who has often spoken of California’s AI dominance, echoed the industry’s competitiveness worries in his veto message of the water-use reporting requirement. The governor said he was reluctant to impose requirements on data centers, “without understanding the full impact on businesses and the consumers of their technology.”

Despite this year’s defeats, some lawmakers say they will attempt to tackle the issue again.

Padilla plans to try again with a bill that would add new rules on who pays for data centers’ long-term grid costs in California, while Assemblymember Rebecca Bauer-Kahan — a Democrat from San Ramon — will revisit her electricity-disclosure bill.

Big Tech warns of job losses but one advocate sees an opening...After blocking most measures this year — and watering down the lone energy-costs bill — Big Tech groups say they’ll revive arguments that new efforts to regulate data centers could cost California jobs.

At a CalMatters event in November, Silicon Valley Leadership Group CEO Ahmad Thomas argued that California must compete to attract investments like the $40 billion data-center project Texas secured with Google. Any policy making deals like that tougher next year would provoke conflict, he added.

“When we get to the details of what our regulatory regime looks like versus other states, or how we can make California more competitive…that’s where sometimes we struggle to find that happy medium,” he said.

Despite having more regulations than some states, California continues to toggle between the 4th and 5th largest economy in the world and has for some time, suggesting that the Golden State is very competitive. 

Dan Diorio, vice president of state policy for the Data Center Coalition, another industry lobbying group, said new requirements on data centers should apply to all other large electricity users.

“To single out one industry is not something that we think would set a helpful precedent,” Diorio said. “We’ve been very consistent with that throughout the country.”

Critics say job loss fears are overblown, noting California built its AI sector without the massive hyperscale facilities that typically gravitate to states with ample, cheaper land and streamlined permitting.

Data-center locations — driven by energy prices, land and local rules — have little to do with where AI researchers live, said Shaolei Ren, an AI researcher at UC Riverside.

“These two things are sort of separate, they’re decoupled,” he said.  

Freedman, of TURN, said lawmakers may have a bargaining chip: if developers cared about cheaper power, they wouldn’t be proposing facilities in a state with high electric rates. That means speed and certainty may be the priority, giving lawmakers the space to potentially offer quicker approvals in exchange for developers covering more grid costs. 

“There’s so much money in this business that the energy bills – even though large – are kind of like rounding errors for these guys,” Freedman said. “If that’s true, then maybe they shouldn’t care about having to pay a little bit more to ensure that costs aren’t being shifted to other customers.”

Alejandro Lazo--https://muckrack.com/alejandrolazo

Monday, December 29, 2025

 

TECH


Refractory satellites: a radical proposal to reduce space pollution

According to a mundophone report, European researchers are advocating a profound change in how satellites are designed, aiming to reduce a little-discussed side effect of the commercial space race: chemical pollution of the Earth's atmosphere. The idea is simple and controversial at the same time. Instead of creating satellites designed to disintegrate upon re-entry into the atmosphere, why not make them resistant enough to survive the fall?

Today, thousands of satellites reach the end of their useful life every year and are deliberately steered onto re-entry trajectories. As they burn up in the atmosphere, they fragment completely, preventing the formation of space debris in orbit and reducing the risk of debris hitting the ground. This concept, known as design for demise, has become standard in the space industry.

However, with the increasing number of satellites, this practice has come to have a significant environmental cost.

When satellites become chemical pollution...As satellites disintegrate during reentry, they release microscopic particles of aluminum oxide into the stratosphere. These compounds catalyze chemical reactions that accelerate the destruction of ozone, a layer essential for protecting Earth from ultraviolet radiation.

A study published in 2024 showed that a typical satellite, weighing about 250 kilograms and composed of approximately 30% aluminum, can generate about 30 kilograms of aluminum oxide nanoparticles when it burns up in the atmosphere. According to the researchers, the increase in the number of reentries contributed to an eightfold increase in the concentration of these harmful oxides over just six years.
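The chemistry behind those numbers can be sketched with basic stoichiometry. Aluminum oxidizes as 4Al + 3O2 -> 2Al2O3, so each kilogram of aluminum yields about 1.89 kilograms of oxide; working backwards from the study's 30-kilogram figure suggests only around a fifth of the satellite's ~75 kg of aluminum needs to end up as oxide nanoparticles. A minimal Python sketch of that arithmetic, using standard molar masses:

```python
# Stoichiometry behind the article's figures (standard molar masses).
M_AL = 26.98       # g/mol, aluminum
M_AL2O3 = 101.96   # g/mol, aluminum oxide

sat_mass_kg = 250.0
al_mass_kg = sat_mass_kg * 0.30                 # ~75 kg of aluminum on board

oxide_per_al = M_AL2O3 / (2 * M_AL)             # ~1.89 kg oxide per kg of Al
al_consumed_kg = 30.0 / oxide_per_al            # Al behind the 30 kg of oxide

print(f"Aluminum on board: {al_mass_kg:.0f} kg")
print(f"Al consumed for 30 kg of oxide: {al_consumed_kg:.1f} kg "
      f"({al_consumed_kg / al_mass_kg:.0%} of the total)")  # ~16 kg, ~21%
```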

It is in this context that engineers at MaiaSpace, a European company linked to ArianeGroup, propose a path opposite to the one adopted so far.

Making satellites “indestructible”...In a recent article, researchers Antoinette Ott and Christophe Bonnal advocate for the so-called design for non-demise. The proposal is to design satellites capable of withstanding the extreme heat of atmospheric reentry, performing a controlled descent to remote ocean regions, such as isolated areas of the Pacific.

This strategy would drastically reduce the release of chemical particles into the atmosphere, but it raises new dilemmas. More robust satellites would be more expensive, require additional propulsion and fuel systems, and increase the risk, albeit controlled, of debris reaching the Earth's surface.

For the authors, the central question becomes a risk assessment. Is it better to accept a small risk of impact on the ground or to continue accumulating long-term chemical damage in the atmosphere?

The answer is far from consensual, but the debate indicates that space pollution is no longer just an orbital problem. It is beginning to enter the planet's climate equation.

mundophone


CES 2026


LG CLOiD: LG's home robot

LG Electronics is preparing to unveil the LG CLOiD, a new home robot designed to take over some routine household tasks, at CES 2026 in Las Vegas, which runs from January 6 to 9. The company frames this launch within its "Zero Labor Home, Makes Quality Time" vision, which seeks to reduce the effort associated with household chores through advanced automation and artificial intelligence.

According to LG, the LG CLOiD is designed to act as a home assistant in an indoor environment, focusing on convenience and natural interaction with users. The robot will be demonstrated in "Zero Labor Home" scenarios at the brand's official booth at the Las Vegas Convention Center, where the company intends to illustrate how robotics integrates into the ecosystem of connected appliances and smart home services.

LG also indicates that it is accelerating its investment in robotics as a new growth area, through the creation of the HS Robotics Lab in the Home Appliance & Air Solution unit and research partnerships with robotics companies in Korea and other markets, reinforcing its ambition to make robotics a natural extension of its premium home appliance offering.

The most distinctive element of the LG CLOiD is its manipulation system. The company describes the robot as having two articulated arms, with motors that provide seven degrees of freedom in each arm, bringing the movements closer to human gestures. This design theoretically allows for greater flexibility in handling objects of different shapes and positions, compared to domestic robots limited to very simple movements.

Each hand integrates five individually acting fingers, which, according to LG, offers finer dexterity for tasks that require precision, such as grasping small objects, adjusting positions, or operating physical controls. In practice, the type of tasks that the LG CLOiD will be able to consistently perform will depend not only on this hardware, but also on the control models, visual perception, and the ability to adapt to the highly variable home environment.
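For readers unfamiliar with the term, seven degrees of freedom means seven independently driven joints per arm, one more than the six needed to place a hand at an arbitrary position and orientation; the redundant joint is what allows human-like elbow motion while the hand stays put. The Python sketch below illustrates such a joint layout generically; the names follow a common humanoid convention and are hypothetical, not LG's.

```python
from dataclasses import dataclass

# Generic sketch of a 7-degree-of-freedom arm state. Joint names follow a
# common humanoid-robotics convention and are NOT from LG's documentation.
@dataclass
class ArmState:
    # One angle (in radians) per independently driven joint:
    # three at the shoulder, one at the elbow, three at the wrist.
    shoulder_pitch: float = 0.0
    shoulder_roll: float = 0.0
    shoulder_yaw: float = 0.0
    elbow_pitch: float = 0.0
    wrist_pitch: float = 0.0
    wrist_roll: float = 0.0
    wrist_yaw: float = 0.0

# Six joints suffice to fix a hand's position and orientation in space; the
# seventh is redundant, letting the elbow swing while the hand stays still,
# which is what makes the motion look human.
left_arm = ArmState(shoulder_pitch=0.5, elbow_pitch=1.2)
```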

In the head module, LG integrates a chipset described as the “brain” of the LG CLOiD, accompanied by a screen, speaker, camera, and additional sensors, forming the core of the robot's perception, communication, and decision-making. The brand says the system uses artificial intelligence to learn from repeated interactions with residents, gradually adjusting the type of help and responses offered.

This concept is presented as “Affectionate Intelligence,” an approach in which the robot seeks to adapt its behavior to the preferences and routines of each home, rather than simply offering pre-programmed responses. However, the statement does not go into detail about where this data is processed (locally or in the cloud), nor about privacy policies, data retention, or consent mechanisms, critical aspects when discussing sensors and image and sound capture in the domestic context.

Potential, limitations, and open questions...The announcement positions the LG CLOiD as a central piece of LG's smart home strategy, but leaves several essential elements open for evaluating technological maturity and practical relevance. Information regarding battery life, continuous operating time, navigation capabilities on complex floor surfaces, drop resistance, physical safety mechanisms, or software update protocols is not disclosed.

Price, commercial launch schedule, or target markets are also not mentioned, indicating that, at this stage, the focus is on demonstrating the concept and asserting leadership in home robotics, rather than on a product with a fully defined commercial roadmap. For users and smart home integrators, the immediate value lies in following the evolution of the platform – especially the combination of five-fingered arms, sensors, and AI – to see if the LG CLOiD will be a viable consumer product or an intermediate step in a broader portfolio of home robots.

by mundophone

Sunday, December 28, 2025

 

APPLE


Leaker that Apple sued doubles down with a huge iPhone foldable leak

Ever dreamt of an iPhone that transforms into an iPad? If you have, the iPhone Fold renders leaked by Jon Prosser, aka YouTube's fpt., will be of interest. Prosser uploaded a video showcasing some official-looking renders of the upcoming device, which, when folded, appears similar to a standard iPhone with a 5.5-inch display and a thickness of 9 millimeters.

While the thinnest parts of the iPhone Air are only 5.6 millimeters thick, keep in mind that the camera bump on that device substantially increases that number. In comparison, the purported iPhone Fold justifies its thickness by unfolding into a 7.8-inch OLED screen, close in size to the iPad mini's 7.9-inch display. If real, it's a gorgeous-looking device, and if Apple actually achieves its rumored goal of removing the visible crease from the unfolded screen, it would mark a first for foldable smartphones.

The rumored price range for the iPhone Fold is between $2,000 and $2,500, which would make it the most expensive iPhone ever. Considering the high cost of Android-based foldables compared to other phones, we suppose this makes sense, but that's a steep ask for even the most ardent Apple fan.

In any case, Prosser seems confident in his sources. Since he has previously been sued by Apple for leaking iOS 26 and has a long history of reputable Apple leaks, he's about as reliable as Apple leakers get. The iPhone 17e leaks from earlier this month are also lent some legitimacy by South Korean news outlet The Elec, so Apple has some non-foldable goodness on the way, too.

Based on the renders, Apple appears to have prioritized the unfolded form factor of the iPhone Fold, whose proportions closely resemble those of an iPad mini. That means that, when folded, the iPhone Fold's shape is a lot more square than a regular iPhone's. That's a different approach from Android folding phones, which often have tall, skinny outer displays to keep the folded form closer to a traditional rectangular shape. Thus, the inside of the folding iPhone will be more widescreen than foldable offerings from Samsung and Google.
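The squarer shape follows from the geometry. If the unfolded 7.8-inch panel has an iPad-mini-like 4:3 aspect ratio (an assumption; the leak only gives the diagonal), halving it along the long edge yields a roughly 3:2 cover, far squarer than a regular iPhone's ~19.5:9. A quick Python sketch:

```python
import math

# Fold-geometry sketch. Only the 7.8-inch diagonal comes from the leak;
# the 4:3 unfolded aspect ratio is an assumption for illustration.
diag_in = 7.8
aw, ah = 4, 3                                   # assumed landscape aspect ratio

scale = diag_in / math.hypot(aw, ah)
width, height = aw * scale, ah * scale          # ~6.24 x 4.68 inches

folded_w = width / 2                            # fold along the long edge
print(f"Unfolded: {width:.2f} x {height:.2f} in")
print(f"Folded cover: {folded_w:.2f} x {height:.2f} in, "
      f"aspect ~{height / folded_w:.2f}:1")     # ~1.50:1 vs ~2.17:1 (19.5:9)
```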

We could get a sense of how real this square-ish form is at WWDC in June. Though Apple won’t specifically address this new form factor, the company could introduce methods for developers to vary app UI designs so that their apps will be able to adjust and maintain usability. It could also go so far as to preview the folding phone months before it releases, as it did with the original iPhone back in 2007.

Are you interested in the iPhone Fold or (slightly) more affordable foldable smartphones? Do you hope this marks the beginning of truly crease-less foldables?

mundophone
