Wednesday, March 25, 2026

Google bumps up Q Day deadline to 2029, far sooner than previously thought


Google is dramatically shortening its readiness deadline for the arrival of Q Day, the point at which quantum computers will be able to break the public-key cryptography algorithms that secure decades' worth of secrets belonging to militaries, banks, governments, and nearly every individual on Earth.

In a post published on Wednesday, Google said it is giving itself until 2029 to prepare for this event. The post went on to warn that the rest of the world needs to follow suit by adopting PQC—short for post-quantum cryptography—algorithms to augment or replace elliptic-curve cryptography and RSA, both of which will be broken.

The end is nigh

“As a pioneer in both quantum and PQC, it’s our responsibility to lead by example and share an ambitious timeline,” wrote Heather Adkins, Google’s VP of security engineering, and Sophie Schmieg, a senior cryptography engineer. “By doing this, we hope to provide the clarity and urgency needed to accelerate digital transitions not only for Google, but also across the industry.”


Reference: https://ift.tt/98OGBMK

What Happens When You Host an AI Café




“Can I get an interview?” “Can I get a job when I graduate?” Those questions came from students during a candid discussion about artificial intelligence, capturing the anxiety many young people feel today. As companies adopt AI-driven interview screeners, restructure their workforces, and redirect billions of dollars toward AI infrastructure, students are increasingly unsure of what the future of work will look like.

We had gathered people at a coffee shop in Auburn, Alabama, for what we called an AI Café. The event was designed to confront concerns about AI directly, demystifying the technology while pushing back against the growing narrative of technological doom.

AI is reshaping society at breathtaking speed. Yet the trajectory of this transformation is being charted primarily by for-profit tech companies, whose priorities revolve around market dominance rather than public welfare. Many people feel that AI is something being done to them rather than developed with them.

As computer science and liberal arts faculty at Auburn University, we believe there is another path forward: One where scholars engage their communities in genuine dialogue about AI. Not to lecture about technical capabilities, but to listen, learn, and co-create a vision for AI that serves the public interest.

The AI Café Model

Last November, we ran two public AI Cafés in Auburn. These were informal, 90-minute conversations between faculty, students, and community members about their experiences with AI. In these conversational forums, participants sat in clusters, questions flowed in multiple directions, and lived experience carried as much weight as technical expertise.

We avoided jargon and resisted attempts to “correct” misconceptions, welcoming whatever emotions emerged. One ground rule proved crucial: keeping discussions in the present, asking participants where they encounter AI today. Without that focus, conversations could easily drift to sci-fi speculation. Historical analogies—to the printing press, electricity, and smartphones—helped people contextualize their reactions. And we found that without shared definitions of AI, people talked past each other; we learned to ask participants to name specific tools they were concerned about.

Organizers Xaq Frohlich, Cheryl Seals, and Joan Harrell (right) held their first AI Café in a welcoming coffee shop and bookstore. Well Red

Most importantly, we approached these events not as experts enlightening the masses, but as community members navigating complex change together.

What We Learned by Listening

Participants arrived with significant frustration. They felt that commercial interests were driving AI development “without consideration of public needs,” as one attendee put it. This echoed deeper anxieties about technology, from social media algorithms that amplify division to devices that profit from “engagement” and replace meaningful face-to-face connection. People aren’t simply “afraid of AI.” They’re weary of a pattern where powerful technologies reshape their lives while they have little say.

Yet when given space to voice concerns without dismissal, something shifted. Participants didn’t want to stop AI development; they wanted to have a voice in it. When we asked, “What would a human-centered AI future look like?” the conversation became constructive. People articulated priorities: fairness over efficiency, creativity over automation, dignity over convenience, community over individualism.

The three organizers, all professors at Alabama’s Auburn University, say that including people from the liberal arts fields brought new perspectives to the discussions about AI. Well Red

For us as organizers, the experience was transformative. Hearing how AI affected people’s work, their children’s education, and their trust in information prompted us to consider dimensions we hadn’t fully grasped. Perhaps most striking was the gratitude participants expressed for being heard. It wasn’t about filling knowledge deficits; it was about mutual learning. The trust generated created a spillover effect, renewing faith that AI could serve the public interest if shaped through inclusive processes.

How to Start Your Own AI Café

The “deficit model” of science communication—where experts transmit knowledge to an uninformed public—has been discredited. Public resistance to emerging technologies reflects legitimate concerns about values, risks, and who controls decision-making. Our events point toward a better model.

We urge engineering and liberal arts departments, professional societies, and community organizations worldwide to organize dialogues similar to our AI Cafés.

We found that a few simple design choices made these conversations far more productive. Informal and welcoming spaces such as coffee shops, libraries, and community centers helped participants feel comfortable (and serving food and drinks helped too!). Starting with small-group discussions, where people talked with neighbors, produced more honest thinking and greater participation. Partnering with colleagues in the liberal arts brought additional perspectives on technology’s social dimensions. And by making a commitment to an ongoing series of events, we built trust.

Facilitation also matters. Rather than leading with technical expertise, we began with values: We asked what kind of world participants wanted, and how AI might help or hinder that vision. We used analogies to earlier technologies to help people situate their reactions, and grounded discussions in present realities, asking participants where they have encountered AI in their daily lives. We welcomed emotions constructively, transforming worry into problem-solving by asking questions like: “What would you do about that?”

Why Engineers Should Engage the Public

Professional ethics codes remain abstract unless grounded in dialogue with affected communities. Conversations about what “responsible AI” means will look different in São Paulo than in Seoul, in Vienna than in Nairobi. What makes the AI Café model portable is its general principles: informal settings, values-first questions, present-tense focus, genuine listening.

Without such engagement, ethical accountability quietly shifts to technical experts rather than remaining a shared public concern. If we let commercial interests define AI’s trajectory with minimal public input, it will only deepen divides and entrench inequities.

AI will continue advancing whether or not we have public trust. But AI shaped through dialogue with communities will look fundamentally different from AI developed solely to pursue what’s technically possible or commercially profitable.

The tools for this work aren’t technical; they’re social, requiring humility, patience, and genuine curiosity. The question isn’t whether AI will transform society. It’s whether that transformation will be done to people or with them. We believe scholars must choose the latter, and that starts with showing up in coffee shops and community centers to have conversations where we do less talking and more listening.

The future of AI depends on it.


Reference: https://ift.tt/xZsRvo2

Are U.S. Engineering Ph.D. Programs Losing Students?




U.S. doctoral programs in electrical engineering form the foundation of technological advancement, training the brightest minds in the world to research, develop, and design next-generation electronics, software, electrical infrastructure, and other high-tech products and systems. Elite institutions have long served as launchpads for the engineers behind tomorrow’s technology.

Now that foundation is under strain.

With U.S. universities increasingly entangled in political battles under the second Trump administration, uncertainty is beginning to ripple through doctoral admissions for electrical engineering programs. While some departments are reducing the number of spots available in anticipation of potential federal funding cuts, others are seeing their applicant pools shrink, particularly among international students, who make up a significant portion of their programs.

In 2024 alone, U.S. universities awarded more than 2,000 doctorates in electrical and computer engineering, according to data from the National Center for Science and Engineering Statistics. The number of computing Ph.D.s grew significantly in the 2010s, according to data from the National Academies, but there is still high demand for those with advanced degrees across academia, government, and industry. Now, some universities point to warning signs of waning enrollment.

Though not all engineers hold Ph.D.s, a continued decline in enrollment could mean fewer engineers developing cutting-edge technology and training the next generation, potentially exacerbating existing labor shortages as global competition for tech talent intensifies.

Federal funding cuts affect admissions

Public universities in particular are feeling the strain because they rely heavily on federal grants to support doctoral students.

The University of California, Los Angeles, for instance, must fund Ph.D. students for the duration of a degree—typically five years. In August 2025, the U.S. government pulled more than US $580 million in federal grants over allegations that the university failed to adequately address antisemitism on campus during student protests. A federal judge has since ordered the funding to be restored, but faculty began to worry that research support could be clawed back without notice, says Subramanian Iyer, distinguished professor at UC Los Angeles’s department of electrical and computer engineering.

According to Iyer, departments across UC Los Angeles, including engineering, plan to scale back Ph.D. admissions this year. “The fear is that at some point, all this government money will be taken away,” Iyer says. “Lowering the admissions rate is just a way to prepare for that reality.”

In response to a request for comment, a spokesperson for the U.S. National Science Foundation—a major source of federal research funding at UC Los Angeles and elsewhere—said, “NSF recognizes the essential role doctoral trainees play in the nation’s engineering and STEM enterprise” and noted several of the foundation’s awards and programs that support graduate research.

Funding shocks may also force Pennsylvania State University to reshape future admissions decisions, according to Madhavan Swaminathan, head of Penn State’s electrical engineering department and director of the Center for Heterogeneous Integration of Micro Electronic Systems (CHIMES), a semiconductor research lab.

In 2023, the Defense Advanced Research Projects Agency (DARPA) and industry partners awarded CHIMES a five-year $32.7 million grant. But in late 2025, the agency pulled its final year of funding from the center, citing a shift in priorities from microelectronics to photonics, Swaminathan says. As a result, CHIMES’ annual budget, which supports research assistantships for roughly 100 engineering graduate students, the majority pursuing Ph.D.s, will fall from $7 million in 2026 to $3.5 million in 2027. If these constraints persist, Penn State’s engineering department may reduce the number of doctoral students it supports.

In a statement, a DARPA spokesperson told IEEE Spectrum: “Basic research is central to identifying world-changing technologies, and DARPA remains committed to engaging academic institutions in our program research. By design, a DARPA program typically lasts about 3 to 5 years. Once we establish proof of concept, we transition the technology for further development and turn our attention to other challenging areas of research.”

Penn State’s enrollment numbers reflect Swaminathan’s caution. He says the electrical engineering Ph.D. cohort shrank from 28 students in 2024 to 15 students in 2025. Applications show a similar pattern. After rising from 195 in 2024 to 247 in 2025, Ph.D. applications fell roughly 30 percent to 174 for the upcoming 2026 cohort, a sign that prospective students may be wary of applying to U.S. programs.

Immigration restrictions and application declines

In late January, the Trump administration announced it had paused visa approvals for citizens of 75 countries. Months earlier, the administration proposed new restrictions on student visas, including a four-year cap.

For Texas A&M University’s graduate electrical and computer engineering programs, up to 80 percent of applicants each year are international students, according to Narasimha Annapareddy, professor and head of the department. Annapareddy says applications for the fall 2026 Ph.D. cohort have dropped by roughly 50 percent.

Annapareddy says the United States is “sending a message that migration is going to be more difficult in the future.” Foreign students often pursue degrees in the U.S. not only for academic training, he says, but to build long-term careers and lives in the country. Fewer applications from international students mean that the university forgoes a “driven and hungry” segment of the applicant pool who are highly qualified in technical fields.

“The fear is that at some point, all this government money will be taken away.”— Subramanian Iyer, UC Los Angeles

At the University of Southern California, the decline is more moderate. The freshman Ph.D. class fell from about 90 students in 2024 to roughly 70 in 2025, a reduction of 22 percent, according to Richard Leahy, department chair of USC’s Ming Hsieh Department of Electrical and Computer Engineering.

While Leahy says applications are down modestly overall, domestic applications have increased by roughly 15 percent. Beyond immigration restrictions, international students, particularly from countries such as India and China, may be staying in their home countries as their technology sectors expand.

“A lot of those students that would normally have come to the U.S. are now taking very good jobs working in the AI industry and other areas,” Leahy says. “There are a lot more opportunities now.”

Workforce pipeline strains

Some faculty say shrinking cohorts could erode the tech workforce if the pattern continues.

At UC Los Angeles, Iyer describes a doctoral ecosystem built on a chain of mentorship. Among the roughly 25 students in his lab, senior doctoral students mentor junior Ph.D. candidates, who in turn guide master’s students and undergraduates. The system depends on overlapping cohorts. Reducing the number of students hired weakens that overlap and the trickle-down benefits of the mentorship model that keeps labs functioning.

The real benefit of the university system isn’t just the teaching but also “the community that you build,” Iyer says. “As you decrease admissions, this will disappear.”

At Penn State, Swaminathan points to specialization as key to a strong workforce. Many doctoral students train in semiconductor engineering, feeding expert talent into the domestic chip industry. If enrollment continues to shrink over the next few years, Swaminathan says, companies may need to hire students with bachelor’s or master’s degrees, who may lack the specialized skills required to design next-generation chips.

“Without that specialization, there’s only so much one can do,” Swaminathan says.

The industry–academia gap

Not all departments are shrinking. At the University of Texas at Austin, overall enrollment has remained relatively steady, according to Diana Marculescu, chair of UT Austin’s Chandra Family Department of Electrical and Computer Engineering.

While she says recent fluctuations aren’t raising alarms, her concern lies more with alignment between research and industry. Doctoral students often train according to current grant priorities, she says. But by the time graduates enter the job market four to six years later, their specialization may not align neatly with open roles. That creates friction in the talent pipeline.

“That lack of connection might be problematic,” Marculescu says. She argues that closer collaboration between universities and the private sector could help create stronger feedback loops between hiring needs and academic research priorities.

For now, USC’s Leahy says Ph.D. graduates remain in high demand, and the current shifts have not yet translated into measurable workforce shortages. “We should be concerned about the number of Ph.D.s,” he says. “But there isn’t a crisis at this point.”

Reference: https://ift.tt/CSuEYzf

Tuesday, March 24, 2026

What Will It Take to Build the World’s Largest Data Center?




The undying thirst for smarter (historically, that means larger) AI models and greater adoption of the ones we already have has led to an explosion in data-center construction projects, unparalleled both in number and scale. Chief among them is Meta’s planned 5-gigawatt data center in Louisiana, called Hyperion, announced in June of 2025. Meta CEO Mark Zuckerberg said Hyperion will “cover a significant part of the footprint of Manhattan,” and the first phase—a 2-GW version—will be completed by 2030.

Though the project’s stated 5-GW scale is the largest among its peers, it’s just one of several dozen similar projects now underway. According to Michael Guckes, chief economist at construction-software company ConstructConnect, spending on data centers topped US $27 billion by July of 2025 and, once the full-year figures are tallied, will easily exceed $40 billion. Hyperion alone accounts for about a quarter of that.

For the engineers assigned to bring these projects to life, the mix of challenges involved represents a unique moment. The world’s largest tech companies are opening their wallets to pay for new innovations in compute, cooling, and network technology designed to operate at a scale that would’ve seemed absurd five years ago.

At the same time, the breakneck pace of building comes paired with serious problems. Modern data-center construction frequently requires an influx of temporary workers and sharply increases noise, traffic, pollution, and often local electricity prices. And the environmental toll remains a concern long after facilities are built because of the unprecedented 24/7 energy demands of AI data centers, which, according to one recent study, could emit the equivalent of tens of millions of tonnes of CO2 annually in the United States alone.

Despite these issues, large AI companies, and the engineers they hire, are going full steam ahead on giant data-center construction. So, what does it really take to build an unprecedentedly large data center?

AI Rewrites Building Design

The stereotypical data-center building rests on a reinforced concrete slab foundation. That’s paired with a steel skeleton and poured concrete wall panels. The finished building is called a “shell,” a term that implies the structure itself is a secondary concern. Meta has even used gigantic tents to throw up temporary data centers.

Still, the scale of the largest AI data centers brings unique challenges. “The biggest challenge is often what’s under the surface. Unstable, corrosive, or expansive soils can lead to delays and require serious intervention,” says Robert Haley, vice president at construction consulting firm Jacobs. Amanda Carter, a senior technical lead at Stantec, says a soil’s thermal conductivity is also important, as most electrical infrastructure is placed underground. “If the soil has high thermal resistivity, it’s going to be difficult to dissipate [heat].” Engineers may take hundreds or thousands of soil samples before construction can begin.

GPUs



Modern AI data centers often use rack-scale systems, such as the Nvidia GB200 NVL72, which occupy a single data-center rack. Each rack contains 72 GPUs, 36 CPUs, and up to 13.4 terabytes of GPU memory. The racks measure over 2.2 meters tall and weigh over one and a half tonnes, forcing AI data centers to use thicker concrete with more reinforcement to bear the load.

A single GB200 rack can use up to 120 kilowatts. If Hyperion meets its 5-gigawatt goals, the data-center campus could include over 41,000 rack-scale systems, for a total of more than 3 million GPUs. The final number of GPUs used by Hyperion is likely to be less than that, though only because future GPUs will be larger, more capable, and use more power.
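The back-of-envelope arithmetic above can be reproduced in a few lines of Python. The assumption that every watt of the campus feeds racks at their maximum draw is ours for illustration, not Meta’s:

```python
# Rough check of the rack and GPU counts implied by Hyperion's 5-GW target.
# Assumption (illustrative): all campus power goes to GB200 NVL72 racks
# running at their 120-kW maximum draw.
CAMPUS_POWER_W = 5e9    # Hyperion's stated 5-gigawatt goal
RACK_POWER_W = 120e3    # maximum draw of one GB200 NVL72 rack
GPUS_PER_RACK = 72

racks = CAMPUS_POWER_W / RACK_POWER_W
gpus = racks * GPUS_PER_RACK
print(f"{racks:,.0f} racks, {gpus:,.0f} GPUs")
# prints: 41,667 racks, 3,000,000 GPUs
```

As the article notes, the real GPU count will likely be lower, since future racks will draw more power per unit.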

Money



According to ConstructConnect, spending on data centers neared US $27 billion through July of 2025 and, according to the latest data, will tally close to $60 billion through the end of the year. Meta’s Hyperion project is a big slice of the pie, at $10 billion.

Data-center spending has become an important prop for the construction industry, which is seeing reduced demand in other areas, such as residential construction and public infrastructure. ConstructConnect’s third quarter 2025 financial report stated that the quarter’s decline “would have been far more severe without an $11 billion surge in data center starts.”


There’s apparently no shortage of eligible sites, however, as both the number of data centers under construction and the money spent on them have skyrocketed. The spending has allowed companies building data centers to throw out the rule book. Prior to the AI boom, most data centers relied on tried-and-true designs that prioritized inexpensive and efficient construction. Big tech’s willingness to spend has shifted the focus to speed and scale.

The loose purse strings open the door to larger and more robust prefabricated concrete wall and floor panels. Doug Bevier, director of development at Clark Pacific, says some concrete floor panels may now span up to 23 meters and need to handle floor loads up to 3,000 kilograms per square meter, which is more than twice the load international building codes normally define for manufacturing and industry. In some cases, the concrete panels must be custom-made for a project, an expensive step that the economics of pre-AI data centers rarely justified.

Simultaneously, the time scale for projects is also compressed: Jamie McGrath, senior vice president of data-center operations at Crusoe, says the company is delivering projects in “about 12 months,” compared to 30 to 36 months before. Not all projects are proceeding at that pace, but speed is universally a priority.

That makes it difficult to coordinate the labor and materials required. Meta’s Hyperion site, located in rural Richland Parish, Louisiana, is emblematic of this challenge. As reported by NOLA.com, at least 5,000 temporary workers have flocked to the area, which has only about 20,000 permanent residents. These workers earn above-average wages and bring a short-term boost for some local businesses, such as restaurants and convenience stores. However, they have also spurred complaints from residents about traffic and construction noise and pollution.

This friction with residents includes not only these obvious impacts, but also things you might not immediately suspect, such as light pollution caused by around-the-clock schedules. Also significant are changes to local water tables and runoff, which can reduce water quality for neighbors who rely on well water. These issues have motivated a few U.S. cities to enact data-center bans.

Data Centers Often Go BYOP (bring your own power)

Meta’s Richland Parish site also highlights a problem that’s priority No. 1 for both AI data centers and their critics: power.

Data centers have always drawn large amounts of power, which nudged data-center construction to cluster in hubs where local utilities were responsive to their demands. Virginia’s electric utility, Dominion Energy, met demand with agreements to build new infrastructure, often with a focus on renewable energy.

The power demands of the largest AI data centers, though, have caught even the most responsive utilities off guard. A report from the Lawrence Berkeley National Laboratory, in California, estimated the entire U.S. data-center industry consumed an average load of roughly 8 GW of power in 2014. Today, the largest AI data-center campuses are built to handle up to a gigawatt each, and Meta’s Hyperion is projected to require 5 GW.

“Data centers are exacerbating issues for a lot of utilities,” says Abbe Ramanan, project director at the Clean Energy Group, a Vermont-based nonprofit.

Ramanan explains that utilities often use “peaker plants” to cope with extra demand. They’re usually older, less efficient fossil-fuel plants which, because of their high cost to operate and carbon output, were due for retirement. But Ramanan says increased electricity demand has kept them in service.

Meta secured power for Hyperion by negotiating with Entergy, Louisiana’s electric utility, for construction of three new gas-turbine power plants. Two will be located near the Richland Parish site, while a third will be located in southeast Louisiana.

Entergy frames the new plants as a win for the state. “A core pillar of Entergy and Meta’s agreement is that Meta pays for the full cost of the utility infrastructure,” says Daniel Kline, director of power-delivery planning and policy at Entergy. The utility expects that “customer bills will be lower than they otherwise would have been.” That would make Hyperion an exception: A recent report from Bloomberg found that electricity rates in regions with data centers are more likely to increase than rates in regions without them.

CO2



Research published in Nature in 2025 projects that data-center emissions will range from 24 million to 44 million CO2-equivalent metric tonnes annually through 2030 in the United States alone. While some materials used in data centers, such as concrete, lead to significant emissions, the majority of these emissions will result from the high energy demands of AI servers.

Estimating the carbon emissions of Hyperion is difficult, as the project won’t be completed until 2030. Assuming that the three new natural gas plants that are planned for construction as part of the project produce emissions typical for their type, however, the plants could lead to full life-cycle emissions of between 4 million and 10 million metric tons of CO2 annually—roughly equivalent to the annual emissions of a country like Latvia.

Concrete



Data centers are typically built from concrete, with steel used as a skeleton to reinforce and shape the concrete shell. While the foundation is often poured concrete, the walls and floors are most often built from prefabricated concrete panels that can span up to 23 meters. Floors use a reinforced T-shape, similar to a steel girder, measuring up to 1.2 meters across at its thickest point. The largest data centers include hundreds of these concrete panels.

The American Cement Association projects that the current surge in building will require 1 million tonnes of cement over the next three years, though that’s still a tiny fraction of the overall cement industry, which weighed in at roughly 103 million tonnes in 2024.


The plants, which will generate a combined 2.26 GW, will use combined-cycle gas turbines that recapture waste heat from exhaust. This boosts thermal efficiency to 60 percent and beyond, meaning more fuel is converted to useful energy. Simple-cycle turbines, by contrast, vent the exhaust, which lowers efficiency to around 40 percent.

Even so, total life-cycle emissions for the Hyperion plants could range from 4 million to over 10 million tonnes of CO2 each year, depending on how often the plants run and on their final efficiency once built. On the high end, that’s as much CO2 as produced by over 2 million passenger cars. Fortunately, not all of Meta’s data centers take the same approach to power. The company has announced a plan to power Prometheus, a large data-center project in Ohio scheduled to come online before the end of 2026, with nuclear energy.
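One rough way to sanity-check that multi-million-tonne range is to multiply the plants’ combined output by a typical combined-cycle emission intensity. The capacity factor and the ~0.37 kg CO2 per kWh figure below are our illustrative assumptions, not values from Entergy or Meta:

```python
# Rough annual-emissions estimate for the three Entergy gas plants.
# Assumed values (illustrative, not from the article's sources):
CAPACITY_GW = 2.26       # combined output of the three plants
HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.8    # assumed near-baseload operation
KG_CO2_PER_KWH = 0.37    # typical combined-cycle natural gas intensity

energy_kwh = CAPACITY_GW * 1e6 * HOURS_PER_YEAR * CAPACITY_FACTOR
tonnes_co2 = energy_kwh * KG_CO2_PER_KWH / 1000
print(f"~{tonnes_co2 / 1e6:.1f} million tonnes CO2 per year")
# prints: ~5.9 million tonnes CO2 per year
```

That estimate lands squarely inside the 4-to-10-million-tonne range cited above; lower utilization or better efficiency would pull it toward the low end.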

But other big tech companies, spurred by the need to build data centers quickly, are taking a less efficient approach.

xAI’s Colossus 2, located in Memphis, is the most extreme example. The company trucked in dozens of temporary gas-turbine generators to power the site, which sits in a suburban neighborhood. OpenAI, meanwhile, has gas turbines capable of generating up to 300 megawatts at its new Stargate data center in Abilene, Texas, slated to open later in 2026. Both use simple-cycle turbines with a much lower efficiency rating than the combined-cycle plants Entergy will build to power Hyperion.

Demand for gas turbines is so intense, in fact, that wait times for new turbines are up to seven years. Some data centers are turning toward refurbished jet engines to obtain the turbines they need.

AI Racks Tip the Scales

The demand for new, reliable power is driven by the power-hungry GPUs inside modern AI data centers.

In January of 2025, Mark Zuckerberg announced in a post on Facebook that Meta planned to end 2025 with at least 1.3 million GPUs in service. OpenAI’s Stargate data center plans to use over 450,000 Nvidia GB200 GPUs, and xAI’s Colossus 2, an expansion of Colossus, is built to accommodate over 550,000 GPUs.

GPUs, which remain by far the most popular processors for AI workloads, are bundled into human-scale monoliths of steel and silicon that, much like the data centers built to house them, are rapidly growing in weight, complexity, and power consumption.

Memory



In addition to raw compute performance, Nvidia GB200 NVL72 racks also require huge amounts of memory. An Nvidia GB200 NVL72 rack may include up to 13.4 terabytes of high-bandwidth memory, which implies that a data-center campus at Hyperion’s scale would require hundreds of petabytes.
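As a rough check on that campus-wide total, multiply the per-rack memory by the roughly 41,700 racks implied by dividing the 5-GW campus power by 120 kW per rack (an illustrative assumption, as in the GPU count above):

```python
# Campus-wide high-bandwidth memory implied by Hyperion's scale.
# Assumption (illustrative): rack count follows from 5 GW / 120 kW per rack.
RACKS = 5e9 / 120e3    # ~41,667 GB200 NVL72 racks
TB_PER_RACK = 13.4     # high-bandwidth memory per rack

total_tb = RACKS * TB_PER_RACK
print(f"~{total_tb / 1000:,.0f} petabytes")
# prints: ~558 petabytes
```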

The immense demand has sent memory prices soaring: The price of DRAM, specifically DDR5, has increased 172 percent in 2025.

Power


Hyperion is expected to use 5 gigawatts of power across 11 buildings, which works out to just under 500 megawatts per building, assuming each will be similar to its siblings. That’s enough to power roughly 4.2 million U.S. homes.
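The per-building and household figures above work out as follows; the even split across buildings and the ~1.2-kW average household draw are our illustrative assumptions:

```python
# Per-building power and household-equivalent figures for Hyperion.
# Assumptions (illustrative): power is split evenly across 11 buildings,
# and an average U.S. home draws about 1.2 kW.
CAMPUS_POWER_MW = 5000
BUILDINGS = 11
AVG_HOME_KW = 1.2

per_building_mw = CAMPUS_POWER_MW / BUILDINGS
homes = CAMPUS_POWER_MW * 1000 / AVG_HOME_KW
print(f"{per_building_mw:.0f} MW per building, ~{homes / 1e6:.1f} million homes")
# prints: 455 MW per building, ~4.2 million homes
```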

Just one Hyperion data center built at the Richland Parish site will consume twice as much power as xAI’s Colossus, which, at the time of its completion in the summer of 2024, was among the largest data centers yet built.


Nvidia’s GB200 NVL72—a rack-scale system—is currently a leading choice for AI data centers. A single GB200 rack contains 72 GPUs, 36 CPUs, and up to 17 terabytes of memory. It measures 2.2 meters tall, tips the scales at up to 1,553 kilograms, and consumes about 120 kilowatts—as much as around 100 U.S. homes. And this, according to Nvidia, is just the beginning. The company anticipates future racks could consume up to a megawatt each.

Viktor Petik, senior vice president of infrastructure solutions at Vertiv, says the rapid change in rack-scale AI systems has forced data centers to adapt. “AI racks consume far more power and weigh more than their predecessors,” says Petik. He adds that data centers must supply racks with multiple power feeds, without taking up extra space.

The new power demands from rack-scale systems have consequences that are reflected in the design of the data center—even its footprint.

In 2022 Meta broke ground on a new data center at a campus in Temple, Texas. According to SemiAnalysis, which studies AI data centers, construction began with the intent to build the data center in an H-shaped configuration common to other Meta data centers.

Land



Meta CEO Mark Zuckerberg kicked off the buzz around Hyperion by saying it would cover a large chunk of Manhattan. Many took that to mean Hyperion would be a single building of that size, which isn’t correct. Hyperion will actually be a cluster of data centers—11 are currently planned—with over 370,000 square meters of floor space. That’s a lot smaller even than New York City’s Central Park, which covers 6 percent of Manhattan.

Meta has room to grow, however. The Richland Parish site spans 14.7 million square meters in total, which is about a quarter the area of Manhattan. And the 370,000 square meters of floor space Hyperion is expected to provide doesn’t include external infrastructure, such as the three new combined-cycle gas power plants Louisiana utility Entergy is building to power the project.



Construction at the Temple campus was paused midway, in December of 2022, as part of a company-wide review of Meta’s data-center infrastructure. Meta decided to knock down the structure it had built and start from scratch. The reasons for this decision were never made public, but analysts believe the old design couldn’t deliver sufficient electricity to new, power-hungry AI racks. Construction resumed in 2023.

Meta’s replacement ditches the H-shaped building for simple, long, rectangular structures, each flanked by rows of gas-turbine generators. While Meta’s plans are subject to change, Hyperion is currently expected to comprise 11 rectangular data centers, each packed with hundreds of thousands of GPUs, spread across the 13.6-square-kilometer Richland Parish campus.

Cooling, and Connecting, at Scale

Nvidia’s ultradense AI GPU racks are changing data centers not only with their weight and power draw but also with their intense cooling and bandwidth requirements.

Data centers traditionally use air cooling, but that approach has reached its limits. “Air as a cooling medium is inherently inferior,” says Poh Seng Lee, head of CoolestLAB, a cooling research group at the National University of Singapore.

Instead, going forward, GPUs will rely on liquid cooling. However, that adds a new layer of complexity. “It’s all the way to the facilities level,” says Lee. “You need pumps, which we call a coolant distribution unit. The CDU will be connected to racks using an elaborate piping network. And it needs to be designed for redundancy.” On the rack, pipes connect to cold plates mounted atop every GPU; outside the data-center shell, pipes route through evaporation cooling units. Lee says retrofitting an air-cooled data center is possible but expensive.

The networking used by AI data centers is also changing to cope with new requirements. Traditional data centers were positioned near network hubs for easy access to the global internet. AI data centers, though, are more concerned with networks of GPUs.

These connections must sustain high bandwidth with impeccable reliability. Mark Bieberich, a vice president at network infrastructure company Ciena, says its latest fiber-optic transceiver technology, WaveLogic 6, can provide up to 1.6 terabits per second of bandwidth per wavelength. A single fiber can support 48 wavelengths in total, and Ciena’s largest customers have hundreds of fiber pairs, placing total bandwidth in the thousands of terabits per second.
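The aggregate follows directly from those figures. A sketch taking the per-wavelength rate as 1.6 terabits per second, consistent with the totals quoted, and assuming 200 fiber pairs (the article says only "hundreds"):

```python
# Aggregate bandwidth across a multi-fiber interconnect, per Ciena's figures.
PER_WAVELENGTH_TBPS = 1.6     # WaveLogic 6, per wavelength
WAVELENGTHS_PER_FIBER = 48
FIBER_PAIRS = 200             # assumed; the article says "hundreds of fiber pairs"

per_fiber_tbps = PER_WAVELENGTH_TBPS * WAVELENGTHS_PER_FIBER
total_tbps = per_fiber_tbps * FIBER_PAIRS
print(f"{per_fiber_tbps:.1f} Tb/s per fiber, ~{total_tbps:,.0f} Tb/s total")
```

Even at 200 pairs, the total lands above 15,000 terabits per second, comfortably in the "thousands of terabits" range.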



This is a point where the scale of Meta’s Hyperion, and other large AI data centers, can be deceptive. It seems to imply that the physical size of a single data center is what matters. But rather than being a single building, Hyperion is actually a set of buildings connected by high-speed fiber optics.

“Interconnecting data centers is absolutely essential,” says Bieberich. “You could think about it as one logical AI training facility, but with geographically distributed facilities.” Nvidia has taken to calling this “scale across,” to contrast it with the idea that data centers must “scale up” to larger singular buildings.

The Big but Hazy Future

The full scale of the challenges facing Hyperion, and other future AI data centers of similar size, remains hazy. Nvidia has yet to introduce the rack-scale AI GPU systems Hyperion will host. How much power will they demand? What type of cooling will they require? How much bandwidth must be provided? These figures can only be estimated.

In the absence of details, the gravity of AI data-center design is pulled toward one certainty: It must be big. Data-center designers are rewriting their rule books to handle power, cooling, and network infrastructure at a scale that would’ve seemed ridiculous five years ago.

This innovation is fueled by big tech’s fat wallets: the industry shelled out tens of billions of dollars in 2025 alone, leading to questions about whether the spending is sustainable. For the engineers in the trenches of data-center design, though, it’s an opportunity to make the impossible possible.

“I tell my engineers, this is peak. We’re being engineers. We’re being asked complicated questions,” says Stantec’s Carter. “We haven’t got to do that in a long time.”

Reference: https://ift.tt/kaWYn92

Self-propagating malware poisons open source software and wipes Iran-based machines


A new hacking group has been rampaging across the Internet in a persistent campaign that spreads a self-propagating, never-before-seen backdoor and, curiously, a data wiper that targets Iran-based machines.

The group, tracked under the name TeamPCP, first gained visibility in December, when researchers from security firm Flare observed it unleashing a worm that targeted cloud-hosted platforms that weren’t properly secured. The objective was to build a distributed proxy and scanning infrastructure and then use it to compromise servers for exfiltrating data, deploying ransomware, conducting extortion, and mining cryptocurrency. The group is notable for its skill in large-scale automation and integration of well-known attack techniques.

Relentless and constantly evolving

More recently, TeamPCP has waged a relentless campaign that uses continuously evolving malware to bring ever more systems under its control. Late last week, it compromised virtually all versions of the widely used Trivy vulnerability scanner in a supply-chain attack after gaining privileged access to the GitHub account of Aqua Security, the Trivy creator.

Read full article

Comments

Reference : https://ift.tt/Z6Brt9c

The Coming Drone-War Inflection in Ukraine




WHEN KYIV-BORN ENGINEER Yaroslav Azhnyuk thinks about the future, his mind conjures up dystopian images. He talks about “swarms of autonomous drones carrying other autonomous drones to protect them against autonomous drones, which are trying to intercept them, controlled by AI agents overseen by a human general somewhere.” He also imagines flotillas of autonomous submarines, each carrying hundreds of drones, suddenly emerging off the coast of California or Great Britain and discharging their cargoes en masse to the sky.

“How do you protect from that?” he asks as we speak in late December 2025; me at my quiet home office in London, he in Kyiv, which is bracing for another wave of missile attacks.

Azhnyuk is not an alarmist. He cofounded and was formerly CEO of Petcube, a California-based company that uses smart cameras and an app to let pet owners keep an eye on their beloved creatures left alone at home. A self-described “liberal guy who didn’t even receive military training,” Azhnyuk changed his mind about developing military tech in the months following the Russian invasion of Ukraine in February 2022. By 2023, he had relinquished his CEO role at Petcube to do what many Ukrainian technologists have done—to help defend his country against a mightier aggressor.

It took a while for him to figure out what, exactly, he should be doing. He didn’t join the military, but through friends on the front line, he witnessed how, out of desperation, Ukrainian troops turned to off-the-shelf consumer drones to make up for their country’s lack of artillery.

Ukrainian troops first began using drones for battlefield surveillance, but within a few months they figured out how to strap explosives onto them and turn them into effective, low-cost killing machines. Little did they know they were fomenting a revolution in warfare.

The Ukrainian robotics company The Fourth Law produces an autonomy module [above] that uses optics and AI to guide a drone to its target. Yaroslav Azhnyuk [top, in light shirt], founder and CEO of The Fourth Law, describes a developmental drone with autonomous capabilities to Ukrainian President Volodymyr Zelenskyy and German Chancellor Olaf Scholz. Top: THE PRESIDENTIAL OFFICE OF UKRAINE; Bottom: THE FOURTH LAW

That revolution was on display last month, as the U.S. and Israel went to war with Iran. It quickly became clear that attack drones were being used extensively by both sides. Iran, for example, is relying heavily on the Shahed drones that the country invented and that are now also being manufactured in Russia and launched by the thousands every month against Ukraine.

A thorough analysis of the Middle East conflict will take some time to emerge. And so to understand the direction of this new way of war, look to Ukraine, where its next phase—autonomy—is already starting to come into view. Outnumbered by the Russians and facing increasingly sophisticated jamming and spoofing aimed at causing drones to veer off course or fall out of the sky, Ukrainian technologists realized as early as 2023 that what could really win the war was autonomy. An autonomous drone isn’t flown by a remote pilot, so there is no communications link that can be severed or spoofed to render the drone useless.

By late 2023, Azhnyuk set out to help make that vision a reality. He founded two companies, The Fourth Law and Odd Systems, the first to develop AI algorithms to help drones overcome jamming during final approach, the second to build thermal cameras to help those drones better sense their surroundings.

“I moved from making devices that throw treats to dogs to making devices that throw explosives on Russian occupants,” Azhnyuk quips.

Since then, The Fourth Law has dispatched “more than thousands” of autonomy modules (it declines to give a more specific figure) to troops in eastern Ukraine; the modules can be retrofitted on existing drones to take over navigation during the final approach to the target. Azhnyuk says the autonomy modules, which cost around US $50, increase the drone-strike success rate to as much as four times that of purely operator-controlled drones.

And that is just the beginning. Azhnyuk is one of thousands of developers, including some who relocated from Western countries, who are applying their skills and other resources to advancing the drone technology that is the defining characteristic of the war in Ukraine. This eclectic group of startups and founders includes Eric Schmidt, the former Google CEO, whose company Swift Beat is churning out autonomous drones and modules for Ukrainian forces. The frenetic pace of tech development is helping a scrappy, innovative underdog hold at bay a much larger and better-equipped foe.

All of this development is careening toward AI-based systems that enable drones to navigate by recognizing features in the terrain, lock on to and chase targets without an operator’s guidance, and eventually exchange information with each other through mesh networks, forming self-organizing robotic kamikaze swarms. Such an attack swarm would be commanded by a single operator from a safe distance.

According to some reports, autonomous swarming technology is also being developed for sea drones. Ukraine has had some notable successes with sea drones, which have reportedly destroyed or damaged around a dozen Russian vessels.

The Skynode X system, from Auterion, provides a degree of autonomy to a drone. AUTERION

For Ukraine, swarming can solve a major problem that puts the nation at a disadvantage against Russia—the lack of personnel. Autonomy is “the single most impactful defense technology of this century,” says Azhnyuk. “The moment this happens, you shift from a manpower challenge to a production challenge, which is much more manageable,” he adds.

The autonomous warfare future envisioned by Azhnyuk and others is not yet a reality. But Marc Lange, a German defense analyst and business strategist, believes that “an inflection point” is already in view. Beyond it, “things will be so dramatically different,” he says.

“Ukraine pretty rapidly realized that if the operator-to-drone ratio can be shifted from one-to-one to one-to-many, that creates great economies of scale and an amazing cost exchange ratio,” Lange adds. “The moment one operator can launch 100, 50, or even just 20 drones at once, this completely changes the economics of the war.”

Drones With a View

For a while, jammers that sever the radio links between drones and operators or that spoof GPS receivers were able to provide fairly reliable defense against human-controlled first-person-view (FPV) drones. But as autonomous navigation has progressed, those electronic shields have gradually become less effective. Defenders must now contend with unjammable drones—ones that are attached to hair-thin optical fibers or that are capable of finding their way to their targets without external guidance. In this emerging struggle, the defenders’ track record isn’t very encouraging: The typical countermeasure is to try to shoot down the attacking drone with a service weapon. It’s rarely successful.

A truck outfitted with signal-jamming gear drives under antidrone nets near Oleksandriya, in eastern Ukraine, on 2 October 2025. ED JONES/AFP/GETTY IMAGES

“The attackers gain an immense advantage from unmanned systems,” says Lange. “You can have a drone pop up from anywhere and it can wreak havoc. But from autonomy, they gain even more.”

The self-navigating drones rely on image-recognition algorithms that have been around for over a decade, says Lange. And the mass deployments of drones on Ukrainian battlefields are enabling both Russian and Ukrainian technologists to create huge datasets that improve the training and precision of those AI algorithms.

A Ukrainian land robot, the Ravlyk, can be outfitted with a machine gun.

While uncrewed aerial vehicles (UAVs) have received the most attention, the Ukrainian military is also deploying dozens of different kinds of drones on land and sea. Ukraine, struggling with the shortage of infantry personnel, began working on replacing a portion of human soldiers with wheeled ground robots in 2024. As of early 2026, thousands of ground robots are crawling across the gray zone along the front line in Eastern Ukraine. Most are used to deliver supplies to the front line or to help evacuate the wounded, but some “killer” ground robots fitted with turrets and remotely controlled machine guns have also been tested.

In mid-February, Ukrainian authorities released a video of a Ukrainian ground robot using its thermal camera to detect a Russian soldier in the dark of night and then kill the invader with a round from a heavy machine gun. So far these robots are mostly controlled by a human operator, but the makers of these uncrewed ground vehicles say their systems are capable of basic autonomous operations, such as returning to base when the radio connection is lost. The goal is to enable them to swarm, so that one operator controls not one but a whole herd of mesh-connected killer robots.

But Bryan Clark, senior fellow and director of the Center for Defense Concepts and Technology at the Hudson Institute, questions how quickly ground robots’ abilities can progress. “Ground environments are very difficult to navigate in because of the terrain you have to address,” he says. “The line of sight for the sensors on the ground vehicles is really constrained because of terrain, whereas an air vehicle can see everything around it.”

To achieve autonomy, maritime drones, too, will require navigational approaches beyond AI-based image recognition, possibly based on star positions or on electronic signals from radios and cell towers that are within reach, says Clark. Such technologies are still being developed or are in a relatively early operational stage.

How the Shaheds Got Better

Russia is not lagging behind. In fact, some analysts believe its autonomous systems may be slightly ahead of Ukraine’s. For a good example of the Russian military’s rapid evolution, they say, consider the long-range Iranian-designed Shahed drones. Since 2022, Russia has been using them to attack Ukrainian cities and other targets hundreds of kilometers from the front line. “At the beginning, Shaheds just had a frame, a motor, and an inertial navigation system,” Oleksii Solntsev, CEO of Ukrainian defense tech startup MaXon Systems, tells me. “They used to be imprecise and pretty stupid. But they are becoming more and more autonomous.” Solntsev founded MaXon Systems in late 2024 to help protect Ukrainian civilians from the growing threat of Shahed raids.

A Russian Geran-2 drone, based on the Iranian Shahed-136, flies over Kyiv during an attack on 27 December 2025. SERGEI SUPINSKY/AFP/GETTY IMAGES

First produced in Iran in the 2010s, Shaheds can carry 90-kilogram warheads up to 650 km (50-kg warheads can go twice as far). They cost around $35,000 per unit, compared to a couple of million dollars, at least, for a ballistic missile. The low cost allows Russia to manufacture Shaheds in high quantities, unleashing entire fleets onto Ukrainian cities and infrastructure almost every night.

The early Shaheds were able to reach a preprogrammed location based on satellite-navigation coordinates. Even these early models could frequently overcome the jamming of satellite-navigation signals with the help of an onboard inertial navigation unit. This was essentially a dead-reckoning system of accelerometers and gyroscopes that estimates the drone’s position from continual measurements of its motion.
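The principle of dead reckoning can be shown with a minimal one-dimensional sketch (hypothetical sample data; a real inertial unit fuses three-axis accelerometers and gyroscopes and accumulates drift over time):

```python
# Minimal 1-D dead reckoning: integrate acceleration twice to estimate
# position with no external (satellite) fix. Illustrative only.
def dead_reckon(accels, dt):
    """Estimate displacement from accelerometer samples taken every dt seconds."""
    velocity = position = 0.0
    for a in accels:
        velocity += a * dt          # first integration: acceleration -> velocity
        position += velocity * dt   # second integration: velocity -> position
    return position

# Constant 2 m/s^2 for 10 s; the true displacement is 0.5 * 2 * 10**2 = 100 m.
est = dead_reckon([2.0] * 100, dt=0.1)
```

Even with perfect samples, the crude integration overshoots the true 100 meters slightly; with noisy real-world sensors that error compounds, which is why inertial-only navigation drifts and must be corrected whenever an external fix is available.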

In the Donetsk Region, on 15 August 2025, a Ukrainian soldier hunts for Shaheds and other drones with a thermal-imaging system attached to a ZU-23 23-millimeter antiaircraft gun. KOSTYANTYN LIBEROV/LIBKOS/GETTY IMAGES

Ukrainian defense forces learned to down Shaheds with heavy machine guns, but as Russia continued to innovate, the daily onslaughts started to become increasingly effective.

Today’s Shaheds fly faster and higher, and are therefore more difficult to detect and take down. Between January 2024 and August 2025, the number of Shaheds and Shahed-type attack drones launched by Russia into Ukraine per month increased more than tenfold, from 334 to more than 4,000. In 2025, Ukraine found AI-enabling Nvidia chipsets in the wreckage of downed Shaheds, as well as thermal-vision modules capable of locking onto targets at night.

“Now, they are interconnected, which allows them to exchange information with each other,” Solntsev says. “They also have cameras that allow them to autonomously navigate to objects. Soon they will be able to tell each other to avoid a jammed region or an area where one of them got intercepted.”

These Russian-manufactured Shaheds, which Russian forces call Geran-2s, are thought to be more capable than the garden-variety Shahed-136s that Iran has lately been launching against targets throughout the Middle East. Even the relatively primitive Shahed-136s have done considerable damage, according to press accounts.

Those Shahed successes may accrue, at least in part, from the fact that the United States and Israel lack Ukraine’s long experience with fending them off. In just two days in early March, upward of a thousand drones, mostly Shaheds, were launched against U.S. and Israeli targets, with hundreds of them reportedly finding their marks.

One attack, caught on video, shows a Shahed destroying a radar dome at the U.S. Navy base in Manama, Bahrain. U.S. forces were understood to be attempting to fend off the drones by striking launch platforms, by dispatching fighter aircraft to shoot them down, and by using some extremely costly air-defense interceptors, including ones meant to down ballistic missiles. On 4 March, CNN reported that in a congressional briefing the day before, top U.S. defense officials, including Secretary of Defense Pete Hegseth, acknowledged that U.S. air defenses weren’t keeping up with the onslaught of Shahed drones.

Russian V2U attack drones are outfitted with Nvidia processors and run computer-vision software and AI algorithms to enable the drones to navigate autonomously. GUR OF THE MINISTRY OF DEFENSE OF UKRAINE

Russia is also starting to field a newer generation of attack drones. One of these, the V2U, has been used to strike targets in the Sumy region of northeastern Ukraine. The V2U drones are outfitted with Nvidia Jetson Orin processors and run computer-vision software and AI algorithms that allow the drones to navigate even where satellite navigation is jammed.

The sale of Nvidia chips to Russia is banned under U.S. sanctions against the country. However, press reports suggest that the chips are getting to Russia via intermediaries in India.

Antidrone Systems Step Up

MaXon Systems is one of several companies working to fend off the nightly drone onslaught. Within one year, the company developed and battle-tested a Shahed interception system that hints at the sci-fi future envisioned by Azhnyuk. For a system to be capable of reliably defending against autonomous weaponry, it, too, needs to be autonomous.

MaXon’s solution consists of ground turrets scanning the sky with infrared sensors, with additional input from a network of radars that detects approaching Shahed drones at distances of, typically, 12 to 16 km. The turrets fire autonomous fixed-wing interceptor drones, fitted with explosive warheads, toward the approaching Shaheds at speeds of nearly 300 km/h. To boost the chances of successful interception, MaXon is also fielding an airborne anti-Shahed fortification system consisting of helium-filled aerostats hovering above the city that dispatch the interceptors from a higher altitude.
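The timeline such a system faces is tight, as a rough sketch shows (the interceptor speed and detection range are from the article; the Shahed cruise speed and head-on geometry are assumptions):

```python
# Rough intercept window for a turret-launched interceptor vs. a Shahed.
DETECTION_KM = 14        # midpoint of the 12-to-16-km radar detection range
INTERCEPTOR_KMH = 300    # interceptor speed, from the article
SHAHED_KMH = 185         # assumed Shahed cruise speed

closing_kmh = INTERCEPTOR_KMH + SHAHED_KMH   # assumed head-on geometry
minutes_to_meet = DETECTION_KM / closing_kmh * 60
print(f"~{minutes_to_meet:.1f} minutes from detection to intercept")
```

Under those assumptions the entire engagement, from first radar contact to collision, lasts under two minutes, which is why detection, takeoff, and mid-course guidance all have to be automatic.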

“We are trying to increase the level of automation of the system compared to existing solutions,” says Solntsev. “We need automatic detection, automatic takeoff, and automatic mid-track guidance so that we can guide the interceptor before it can itself lock onto the target.”

An interceptor drone, part of the U.S. MEROPS defensive system, is tested in Poland on 18 November 2025. WOJTEK RADWANSKI/AFP/GETTY IMAGES

In November 2025, the Ukrainian military announced it had been conducting successful trials of the Merops Shahed drone interceptor system developed by the U.S. startup Project Eagle, another of former Google CEO Eric Schmidt’s Ukraine defense ventures. Like the MaXon gear, the system can operate largely autonomously and has so far downed over 1,000 Shaheds.

What Works in the Lab Doesn’t Necessarily Fly on the Battlefield

Despite the progress on both sides, analysts say that the kind of robotic warfare imagined by Azhnyuk won’t be a reality for years.

“The software for drone collaboration is there,” says Kate Bondar, a former policy advisor for the Ukrainian government and currently a research fellow at the U.S. Center for Strategic and International Studies. “Drones can fly in labs, but in real life, [the forces] are afraid to deploy them because the risk of a mistake is too high,” she adds.

Ukrainian soldiers watch a GOR reconnaissance drone take to the sky near Pokrovsk in the Donetsk region, on 10 March 2025. ANDRIY DUBCHAK/FRONTLINER/GETTY IMAGES

In Bondar’s view, powerful AI-equipped drones won’t be deployed in large numbers given the current prices for high-end processors and other advanced components. And, she adds, the more autonomous the system needs to be, the more expensive are the processors and sensors it must have. “For these cheap attack drones that fly only once, you don’t install a high-resolution camera that [has] the resolution for AI to see properly,” she says. “[You install] the cheapest camera. You don’t want expensive chips that can run AI algorithms either. Until we can achieve this balance of technological sophistication, when a system can conduct a mission but at the lowest price possible, it won’t be deployed en masse.”

While existing AI systems do a good job of recognizing and following large objects like Shaheds or tanks, experts question their ability to reliably distinguish and pursue smaller, more nimble, or inconspicuous targets. “When we’re getting into more specific questions, like can it distinguish a Russian soldier from a Ukrainian soldier or at least a soldier from a civilian? The answer is no,” says Bondar. “Also, it’s one thing to track a tank, and it’s another to track infantrymen riding buggies and motorcycles that are moving very fast. That’s really challenging for AI to track and strike precisely.”

Clark, at the Hudson Institute, says that although the AI algorithms used to guide the Russian and Ukrainian drones are “pretty good,” they rely on information provided by sensors that “aren’t good enough.” “You need multiphenomenology sensors that are able to look at infrared and visual and, in some cases, different parts of the infrared spectrum to be able to figure out if something is a decoy or real target,” he says.

German defense analyst Lange agrees that right now, battlefield AI image-recognition systems are too easily fooled. “If you compress reality into a 2D image, a lot of things can be easily camouflaged—like what Russia did recently, when they started drawing birds on the back of their drones,” he says.

Autonomy Remains Elusive on the Ground and at Sea, Too

To make Ukraine’s emerging uncrewed ground vehicles (UGVs) equally self-sufficient will be an even greater task, in Clark’s view. Still, Bondar expects major advances to materialize within the next several years, even if humans are still going to be part of the decision-making loop.

A mobile electronic-warfare system built by PiranhaTech is demonstrated near Kyiv on 21 October 2025. DANYLO ANTONIUK/ANADOLU/GETTY IMAGES

“I think in two or three years, we will have pretty good full autonomy, at least in good weather conditions,” she says, referring to aerial drones in particular. “Humans will still be in the loop for some years, simply because there are so many unpredictable situations when you need an intervention. We won’t be able to fully rely on the machine for at least another 10 or 15 years.”

Ukrainian defenders are apprehensive about that autonomous future. The boom of drone innovation has come hand in hand with the development of sophisticated jamming and radio-frequency detection systems. But a lot of that innovation will become obsolete once the pendulum swings away from human control. Ukrainians got their first taste of dealing with unjammable drones in mid-2024, when Russia began rolling out fiber-optic tethered drones. Now they have to brace for a threat on a much larger scale.

An experimental drone is demonstrated at the Brave1 defense-tech incubator in Kyiv. DANYLO DUBCHAK/FRONTLINER/GETTY IMAGES

“Today, we have a situation where we have lots of signals on the battlefield, but in the near future, in maybe two to five years, UAVs are not going to be sending any signals,” says Oleksandr Barabash, CTO of Falcons, a Ukrainian startup that has developed a smart radio-frequency detection system capable of revealing precise locations of enemy radio sources such as drones, control stations, and jammers.

Last September, Falcons secured funding from the U.S.-based dual-use tech fund Green Flag Ventures to scale production of its technology and work toward NATO certification. But Barabash admits that its system, like all technologies fielded in Ukrainian war zones, has an expiration date. Instead of radio-frequency detectors, Barabash thinks, the next R&D push needs to focus on passive radar systems capable of identifying small, fast-moving targets from ambient signals, such as those of TV towers and radio transmitters, that propagate through the environment and are reflected by those targets. Passive radars have a significant advantage in the war zone, according to Barabash: Since they don’t emit their own signal, they can’t easily be discovered by the enemy.

“Active radar is emitting signals, so if you are using active radars, you are target No. 1 on the front line,” Barabash says.

Bondar, on the other hand, thinks that the increased onboard compute power needed for AI-controlled drones will, by itself, generate enough electromagnetic radiation to prevent autonomous drones from ever operating completely undetectably.

“You can have full autonomy, but you will still have systems onboard that emit electromagnetic radiation or heat that can be detected,” says Bondar. “Batteries emit electromagnetic radiation, motors emit heat, and [that heat can be] visible in infrared from far away. You just need to have the right sensors to be able to identify it in advance.” She adds that the takeaway is “how capable contemporary detection systems have become and how technically challenging it is to design drones that can reliably operate in the Ukrainian battlefield environment.”

There Will Be Nowhere to Hide from Autonomous Drones

When autonomous drones become a standard weapon of war, their threat will extend far beyond the battlefields of Ukraine. Autonomous turrets and drone-interceptor fortifications might soon dot the perimeters of European cities, particularly in the eastern part of the continent.

A fixed-wing drone is tested in Ukraine in April 2025. ANDREW KRAVCHENKO/BLOOMBERG/GETTY IMAGES

Nefarious actors from all over the world have closely watched Ukraine and taken notes, warns Lange. Today, FPV drones are being used by Islamist terrorists in Africa and by Mexican drug cartels to fight local authorities.

When autonomous killing machines become widely available, it’s likely that no city will be safe. “We might see nets above city centers, protecting civilian streets,” Lange says. “In every case, the West needs to start performing similar kinetic-defense development that we see in Ukraine. Very rapid iteration and testing cycles to find solutions.”

Azhnyuk is concerned that the historic defenders of Europe—the United States and the European countries themselves—are falling behind. “We are in danger,” he says. While Russia and Ukraine made major strides in their drones and countermeasures over the past year, “Europe and the United States have progressed, in the best-case scenario, from the winter-of-2022 technology to the summer-of-2022 technology.

“The gap is getting wider,” he warns. “I think the next few years are very dangerous for the security of Europe.”

Reference: https://ift.tt/Vh5dj8z

Monday, March 23, 2026

Remembering IEEE Power & Energy Society Leader Mel Olken




Mel Olken

Former executive director of the IEEE Power & Energy Society

Fellow, 92; died 9 January

Olken became the first executive director of the IEEE Power & Energy Society (PES) in 1995. In 2002 he left the position to serve as founding editor in chief of the society’s Power & Energy Magazine. Olken led the publication until 2016, when he retired.

After receiving a bachelor’s degree in engineering from the City College of New York, Olken was hired as an electrical engineer by American Electric Power, a utility based in Columbus, Ohio. He helped design coal, hydroelectric, and nuclear power plants. While at AEP, he was promoted to manager of the electrical generation department.

He joined IEEE in 1958 and became a PES member in 1973. An active volunteer, he chaired the society’s energy development and power generation committee and its technical council.

Olken was elected an IEEE Fellow in 1988 for “contributions to innovative design of reliable generating stations.”

He became an IEEE staff member in 1984 as society services director for IEEE Technical Activities. From 1990 to 1995 he served as managing director of the IEEE Regional Activities group (now IEEE Member and Geographic Activities), before becoming PES executive director.

He received a PES Lifetime Achievement Award in 2012 for his “broad and sustained technical contributions to the development of power engineering and the power engineering profession.”

Stephanie A. Huguenin

Research scientist

IEEE member, 48; died 1 October

Huguenin was an administrative assistant in the physics and biophysics department at Augusta University, in Georgia. According to her Augusta obituary, she died of an illness acquired during her volunteer work in India.

She received a bachelor’s degree in engineering in 1999 from the College of Charleston, in South Carolina. During her senior year, she worked as a mathematics and science tutor at the Jenkins Orphanage (now the Jenkins Institute for Children), in North Charleston. After graduating, Huguenin traveled to India to volunteer at an orphanage run by the Mother Teresa Foundation.

Upon returning to the United States in 2001, Huguenin worked as a freelance research consultant. Three years later she was hired as a systems administrator and archivist by photographer Ebet Roberts in New York City. In 2010 she left to work as an operations strategist and technical consultant.

She earned a master’s degree in communication and research science in 2016 from New York University. While at NYU, she conducted experimental and theoretical research in Internet Protocol design and implementation as well as network security and management.

From 2020 to 2024 she was a research scientist at businesses owned by her family. She joined Augusta University in 2023.

She was a member of the IEEE Geoscience and Remote Sensing Society and the IEEE Systems Council.

Huguenin volunteered for the Internet Engineering Task Force, a standards development organization, and the American Registry for Internet Numbers. ARIN manages and distributes internet number resources such as IP addresses and autonomous system numbers.

The nonprofits she supported included the Coastal Conservation League, the Longleaf Alliance, the Lowcountry Land Trust, the Nature Conservancy, and Women in Defense.

Reference: https://ift.tt/EYbU7Zy
