Thursday, February 26, 2026

New AirSnitch attack breaks Wi-Fi encryption in homes, offices, and enterprises


It’s hard to overstate the role that Wi-Fi plays in virtually every facet of life. The organization that shepherds the wireless protocol says that more than 48 billion Wi-Fi-enabled devices have shipped since it debuted in the late 1990s. One estimate pegs the number of individual users at 6 billion, roughly 70 percent of the world’s population.

Despite this dependence and the immeasurable amount of sensitive data flowing through Wi-Fi transmissions, the protocol's history has been littered with security landmines. These stem both from the confidentiality weaknesses inherited from its networking predecessor, Ethernet (it was once possible for anyone on a network to read and modify the traffic sent to anyone else), and from the fact that anyone nearby can receive the radio signals Wi-Fi relies on.

Ghost in the machine

In the early days, public Wi-Fi networks often resembled the Wild West, where ARP spoofing attacks that allowed renegade users to read other users' traffic were common. The solution was to build cryptographic protections that prevented nearby parties—whether an authorized user on the network or someone near the AP (access point)—from reading or tampering with the traffic of any other user.

Read full article

Reference: https://ift.tt/EQ4bitB

How Stupid Would It Be to Put Data Centers in Space?




What’s the difference between a stupid idea and a brilliant one? Sometimes, it just comes down to resources. Practically unlimited funds, like limitless thrust, can get even a mad idea off the ground.

And so it might be for the concept of putting AI data centers in orbit. In a rare moment of unalloyed agreement, some of the richest and most powerful men in technology are staunchly backing the idea. The group includes Elon Musk, Jeff Bezos, Jensen Huang, Sam Altman, and Google CEO Sundar Pichai. In all likelihood, hundreds of people are now working on the concept of space data centers at the firms directly or indirectly controlled by these men—SpaceX, Starlink, Tesla, Amazon, Blue Origin, Nvidia, OpenAI, and Google, among others.

Pie charts compare the costs of an orbital data center ($51.1 billion) and a terrestrial one ($16 billion). Likely costs to design, build, and launch a 1-GW orbital data center, based on a network of some 4,400 satellites and including operating costs over a five-year period, would exceed US $50 billion. That's about three times the cost of a 1-GW data center on Earth, including five years of operation. John MacNeill

So how much would it cost to start training large language models in space? Probably the best accounting is one created by aerospace engineer Andrew McCalip. McCalip’s exhaustive, detailed analysis includes interactive sliders that let you compare costs for space-based and terrestrial data centers in the range of 1 to 100 gigawatts. One-gigawatt data centers are being built now on terra firma, and Meta has announced plans for a 5-GW facility, with anticipated completion some time after 2030.

In an interview, McCalip says his initial rough calculations a few years ago suggested that data centers in space would cost in the range of 7 to 10 times more, per gigawatt of capacity, than their terrestrial counterparts. “It just wasn’t practical,” he says. “Not even close.” But when Elon Musk began publicly backing the idea, McCalip revisited the numbers using publicly available information about Starlink’s and Tesla’s technologies and capabilities.

That changed the picture substantially. The figures in his online analysis assume an orbital network of data-center satellites that borrows heavily from Musk’s tech treasure chest—“essentially…you just start putting some radiation-resistant ASIC chips on the Starlink fleet and you start growing edge capacity organically on the Starlink fleet,” McCalip says. The network would rely on the kind of watt-efficient GPU architecture used in Teslas for self-driving, he adds. “You start dropping those onto the backs of Starlinks. You can slowly grow this out, and this would be approximately the performance that you would get.”

Bottom line, with some solid but not necessarily heroic engineering, the cost of an orbital data center could be as low as three times that of the comparable terrestrial one. That differential, while still high, at least nudges the concept out of the instantly dismissible category. “I have my particular views, but I want the data to speak for itself,” McCalip says.

For this illustration, we picked a configuration with an aggregate 1 GW of capacity. The network would consist of some 4,300 satellites, each of which would be outfitted with a roughly 1,000-square-meter solar array that generates 250 kilowatts. The data center on that satellite, powered by the array, might have at least 175 GPUs; McCalip notes that a popular GPU rack, Nvidia's NVL72, has 72 GPUs and requires 120 to 140 kW.

The total cost of the satellite network would be around US $51 billion, including launch and five years of operational expenses; a comparable terrestrial system would cost about $16 billion over the same period.
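The quoted figures are easy to sanity-check with a few lines of arithmetic. The sketch below is my own illustration; every input value comes from the numbers in the article above.

```python
# Back-of-envelope check of the orbital data-center figures quoted above.
TOTAL_CAPACITY_W = 1e9        # 1-GW aggregate capacity
NUM_SATELLITES = 4_300        # satellites in the network
ORBITAL_COST = 51e9           # US $, including launch and 5 years of operation
TERRESTRIAL_COST = 16e9       # US $, same five-year period

# Power each satellite must supply if capacity is spread evenly
per_sat_kw = TOTAL_CAPACITY_W / NUM_SATELLITES / 1e3
print(f"Per-satellite power: {per_sat_kw:.0f} kW")

# Cost penalty of going to orbit
ratio = ORBITAL_COST / TERRESTRIAL_COST
print(f"Orbital/terrestrial cost ratio: {ratio:.1f}x")
```

The per-satellite requirement comes out near 233 kW, comfortably inside the quoted 250-kW array output, and the cost ratio lands at roughly 3.2, matching the "about three times" figure above.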

Stupid? Not stupid? You decide.

Reference: https://ift.tt/ZjvlXdM

Achieving Micron-Level Tolerances: CAD Optimization for Sub-10µm 3D Printing




Achieve successful micro-scale 3D prints by optimizing tolerances, wall thickness, support strategies, microfluidic channels, and material selection in your CAD models from the start.

What Attendees Will Learn

  1. Tolerance-driven design -- How to define resolution and tolerance constraints that translate directly from CAD intent to sub-10µm printed geometry.
  2. Geometry-aware fabrication -- Principles for engineering wall thickness, aspect ratios, and orientation to maintain structural fidelity at micron scale.
  3. Support-free design strategies -- Leveraging self-supporting geometries and build orientation to preserve feature integrity without post-processing trade-offs.
  4. Integrated material-process thinking -- Matching resin properties, shrinkage behavior, and export parameters to your application’s functional requirements.
Reference: https://ift.tt/XzeSbFI

Wednesday, February 25, 2026

How to Thrive as a Remote Worker




This article is crossposted from IEEE Spectrum’s careers newsletter. Sign up now to get insider tips, expert advice, and practical strategies, written in partnership with tech career development company Parsity and delivered to your inbox for free!

Standing Out as a Remote Worker Takes a Different Strategy

My first experience as a remote worker was a disaster.

Before I joined a San Francisco-based team with a lead developer in Connecticut, I had worked in person, five days a week. I thought success was simple: write good code, solve hard problems, deliver results. So I put my head down and worked harder than ever.

Twelve-hour days became normal as the boundary between work and personal life disappeared. My kitchen table became my office.

I rarely asked for help because I didn’t want to seem incompetent. I stayed quiet in team Slack channels because I wasn’t sure what to say.

Despite working some of the longest hours of my career, I made the slowest progress. I felt disconnected from the team. I had no idea if my work mattered or if anyone noticed what I was doing. I was burning out.

Eventually, I realized the real problem: I was invisible.

The Office Advantage You Lose When Remote

In an office, visibility happens naturally. Colleagues see you arrive early or stay late. They notice when you are stuck on a problem. They hear about your work in hallway conversations and over lunch. Physical presence creates recognition with almost no effort.

Remote work removes those signals. Your manager cannot see you at your desk. Your teammates don’t know you’ve hit a roadblock unless you say so. You can work long days and still appear less engaged than someone in the office.

That is the shift many people miss: Remote work requires execution plus deliberate communication.

What Actually Works

By my second remote role, I knew I had to change to protect my sanity and still succeed.

Here are five things I did that made a real difference.

1. Over-communicating

I began sharing updates in team channels regularly, not just when asked. “Working on the payment integration today; ready for review tomorrow.” “Hit a blocker with API rate limits; investigating options.” These took seconds but made my work visible and invited help sooner.

2. Setting limits

When your home is also your office, overwork becomes the default. I started ending most days at 5 p.m. and transitioning out of work mode with a walk or gym session. That ritual helped prevent burnout.

3. Volunteering for presentations

Presenting remotely felt less intimidating than standing in front of a room. I started volunteering for demos and lunch-and-learns. This increased my visibility beyond my immediate team and improved my communication skills.

4. Promoting others publicly

When someone helped me, I thanked them in a public channel. When a teammate shipped something impressive, I called it out. This builds goodwill and signals collaboration. In remote environments, gratitude is visible and memorable.

5. Building relationships deliberately

In an office, relationships form naturally. Remotely, you have to create those moments. I started an engineering book club that met every other week to discuss a technical book. It became a low-pressure way to connect with people across the organization.

The Counterintuitive Reality

With these habits, I got promoted faster in this remote job than I ever did in an office. I moved from senior engineer to engineering manager in under two years, while maintaining a better work-life balance.

Remote work offers flexibility and freedom, but it comes with a tax. You are easier to overlook and more likely to burn out unless you are intentional in your actions.

So, succeeding remotely takes deliberate effort in communication, relationships, and boundaries. If you do that well, remote work can unlock more opportunities than you might expect.

—Brian

This Former Physicist Helps Keep the Internet Secure

Despite its critical role in maintaining a secure network, authentication software often goes unnoticed by users. Alan DeKok now runs one of the most widely used remote authentication servers in the world—but he didn’t initially set out to work in cybersecurity. DeKok studied nuclear physics before starting the side project that eventually turned into a three-decade-long career.

Read more here.

More Than 30,000 Tech Employees Laid Off in 2026

We’re just two months into 2026, and layoffs in the tech industry are already ramping up. According to data compiled by RationalFX, more than half of the 30,700 layoffs this year have come from Amazon, which announced that it would be cutting the roles of 16,000 employees in late January. Will the trend continue through 2026?

Read more here.

IEEE Online Mini-MBA Aims to Fill Leadership Skills Gaps in AI

Recent research suggests that a majority of organizations have a significant gap when it comes to AI skills among leadership. To help fill the gap, IEEE has partnered with the Rutgers Business School to offer an online “mini-MBA” program, combining business strategy and deep AI literacy. The program spans 12 weeks and 10 modules that teach students how to implement AI strategies in their own organizations.

Read more here.

Reference: https://ift.tt/sSoWY9k

Jimi Hendrix Was a Systems Engineer




3 February 1967 is a day that belongs in the annals of music history. It’s the day that Jimi Hendrix entered London’s Olympic Studios to record a song using a new component. The song was “Purple Haze,” and the component was the Octavia guitar pedal, created for Hendrix by sound engineer Roger Mayer. The pedal was a key element of a complex chain of analog elements responsible for the final sound, including the acoustics of the studio room itself. When they sent the tapes for remastering in the United States, the sounds on them were so novel that they included an accompanying note explaining that the distortion at the end was intentional, not a malfunction. A few months later, Hendrix would deliver his legendary electric guitar performance at the Monterey International Pop Festival.

“Purple Haze” firmly established that an electric guitar can be used not just as a stringed instrument with built-in pickups for convenient sound amplification, but also as a full-blown wave synthesizer whose output can be manipulated at will. Modern guitarists can reproduce Hendrix’s chain using separate plug-ins in digital audio workstation software, but the magic often disappears when everything is buffered and quantized. I wanted to find out if a more systematic approach could do a better job and provide insights into how Hendrix created his groundbreaking sound.

My fascination with Hendrix’s Olympic Studios’ performance arose because there is a “Hendrix was an alien” narrative surrounding his musical innovation—that his music appeared more or less out of nowhere. I wanted to replace that narrative with an engineering-driven account that’s inspectable and reproducible—plots, models, and a signal chain from the guitar through the pedals that you can probe stage by stage.

A selection of plots from the full-circuit analysis shows how each effects pedal in Hendrix’s chain enhanced the electric guitar beyond its intrinsic limits: the Fuzz Face turns a sinusoidal string signal into an almost square wave; the Octavia pedal inverts half the input waveform to double its frequency; the wah-wah pedal acts as a band-pass filter; and the Uni-Vibe pedal introduces selective phase shifts to color the sound. James Provost/Rohan S. Puranik

Although I work mostly in the digital domain as an edge-computing architect in my day job, I knew that analog circuit simulations would be the key to going deeper.

My first step was to look at the challenges Hendrix was trying to address. Before the 1930s, guitars were too quiet for large ensembles. Electromagnetic pickups—coils of wire wrapped around magnets that detect the vibrations of metal strings—fixed the loudness problem. But they left a new one: the envelope, which specifies how the amplitude of a note varies as it’s played on an instrument, starting with a rising initial attack, followed by a falling decay, and then any sustain of the note after that. Electric guitars attack hard, decay fast, and don’t sustain like bowed strings or organs. Early manufacturers tried to modify the electric guitar’s characteristics by using hollow bodies fitted with magnetic pickups, but the instrument still barked more than it sang.
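The envelope problem described above can be sketched numerically. The snippet below is my own illustration (not from the article), contrasting a plucked-guitar-style envelope, with its sharp attack and fast decay, against an organ-like envelope that sustains for as long as the note is held:

```python
import numpy as np

fs = 1_000                       # samples per second (coarse, for illustration)
t = np.arange(0, 2.0, 1 / fs)    # two seconds of a held note

# Plucked electric-guitar-style envelope: fast attack, exponential decay
guitar_env = np.minimum(t / 0.01, 1.0) * np.exp(-t / 0.3)

# Organ-style envelope: modest attack, then full sustain while the key is held
organ_env = np.minimum(t / 0.05, 1.0)

# After one second the guitar note has mostly died away; the organ has not
print(f"guitar level at t=1s: {guitar_env[fs]:.2f}")
print(f"organ  level at t=1s: {organ_env[fs]:.2f}")
```

One second in, the plucked envelope has fallen to a few percent of its peak while the organ envelope is still at full level, which is exactly the gap Hendrix's signal chain set out to close.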

Hendrix’s mission was to reshape both the electric guitar’s envelope and its tone until it could feel like a human voice. He tackled the guitar’s constraints by augmenting it. His solution was essentially a modular analog signal chain driven not by knobs but by hands, feet, gain staging, and physical movement in a feedback field.

Hendrix’s setups are well documented: Set lists, studio logs, and interviews with Mayer and Eddie Kramer, then the lead engineer at Olympic Studios, fill in the details. The signal chain for “Purple Haze” consisted of a set of pedals—a Fuzz Face, the Octavia, and a wah-wah—plus a Marshall 100-watt amplifier stack, with the guitar and room acoustics closing a feedback loop that Hendrix tuned with his own body. Later, Hendrix would also incorporate a Uni-Vibe pedal for many of his tracks. All the pedals were commercial models except for the Octavia, which Mayer built to produce a distorted signal an octave higher than its input.

I obtained the schematics for each of these elements and their accepted parameter ranges, and converted them into netlists that ngspice can process (ngspice is an open source implementation of the SPICE circuit analyzer). The Fuzz Face pedal came in two variants, using germanium or silicon transistors, so I created models for both. In my models, Hendrix’s guitar pickups had a resistance of 6 kiloohms and an inductance of 2.5 henrys with a realistic cable capacitance.
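With the pickup values quoted above, the pickup plus cable forms a classic RLC resonator whose peak sits in the guitar's upper midrange. A quick check (the 500-pF cable capacitance is my assumed "realistic" value, not a figure from the article):

```python
import math

R = 6_000       # pickup DC resistance, ohms (from the article)
L = 2.5         # pickup inductance, henrys (from the article)
C = 500e-12     # cable capacitance, farads (assumed value)

# Resonant frequency of the series-L, shunt-C pickup/cable network
f0 = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"Resonant peak: {f0:.0f} Hz")

# Quality factor, which sets how pronounced that resonant peak is
Q = math.sqrt(L / C) / R
print(f"Q: {Q:.1f}")
```

The resonance lands around 4.5 kHz, which is why pickup loading by the pedals downstream, as in the Fuzz Face interaction described later, audibly changes the guitar's tone.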

I chained the circuit simulations together using a script, and I produced data-plot and sample sound outputs with Python scripts. All of the ngspice files and other scripts are available in my GitHub repository at github.com/nahorov/Hendrix-Systems-Lab, with instructions on how to reproduce my simulations.

What Does The Analysis of Hendrix’s Signal Chain Tell Us?

Plotting the signal at different points in the chain with different parameters reveals how Hendrix configured and manipulated the nonlinear complexities of the system as a whole to reach his expressive goals.

A few highlights: First, the Fuzz Face is a two-transistor feedback amplifier that turns a gentle sinusoid signal into an almost binary “fuzzy” output. The interesting behavior emerges when the guitar’s volume is reduced. Because the pedal’s input impedance is very low (about 20 kΩ), the pickups interact directly with the pedal circuit. Reducing amplitude restores a sinusoidal shape—producing the famous “cleanup effect” that was a hallmark of Hendrix’s sound, where the fuzz drops in and out as desired while he played.
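The cleanup effect can be demonstrated with a generic saturating nonlinearity. The sketch below uses tanh as a stand-in for the Fuzz Face's transistor clipping; the real circuit is asymmetric and impedance-dependent, so this is only a caricature of the behavior, not a model of the pedal:

```python
import numpy as np

t = np.linspace(0, 1, 1_000, endpoint=False)
sine = np.sin(2 * np.pi * 5 * t)        # a clean sinusoid from the strings

def fuzz(x, drive):
    """Generic saturating stage: high drive clips toward a square wave."""
    return np.tanh(drive * x)

loud = fuzz(sine, drive=10.0)           # guitar volume up: heavy clipping
quiet = fuzz(0.05 * sine, drive=10.0)   # volume rolled back: nearly linear again

# Crest factor (peak / RMS): ~1.41 for a sine, ~1.0 for a square wave
for name, sig in [("clean", sine), ("loud fuzz", loud), ("cleaned up", quiet)]:
    crest = np.max(np.abs(sig)) / np.sqrt(np.mean(sig ** 2))
    print(f"{name:>11}: crest factor {crest:.2f}")
```

With the guitar volume up, the crest factor collapses toward the square-wave value of 1.0; rolling the volume back restores a nearly sinusoidal crest factor, which is the cleanup effect in miniature.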

The Jimi Hendrix Experience, (left to right) Mitch Mitchell, Jimi Hendrix, and Noel Redding, beside a recording studio mixing desk. Fred W. McDarrah/Getty Images

Second, the Octavia pedal used a rectifier, which normally converts alternating to direct current. Mayer realized that a rectifier effectively flips each trough of a waveform into a peak, doubling the number of peaks per second. The result is an apparent doubling of frequency—a bloom of second-harmonic content that the ear hears a bright octave above the fundamental.
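Mayer's rectifier trick can be verified in a few lines: full-wave rectifying a sinusoid (taking its absolute value) moves the strongest spectral component from the fundamental f to 2f. A minimal sketch:

```python
import numpy as np

fs = 8_000                      # sample rate, Hz
f = 100                         # input "string" frequency, Hz
t = np.arange(0, 1.0, 1 / fs)
sine = np.sin(2 * np.pi * f * t)

# Ideal full-wave rectifier: every trough is flipped into a peak
rectified = np.abs(sine)

def dominant_hz(signal):
    """Frequency of the strongest AC component of the signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    spectrum[0] = 0.0           # ignore the DC offset the rectifier introduces
    return np.argmax(spectrum) * fs / len(signal)

print(dominant_hz(sine))        # the 100-Hz fundamental
print(dominant_hz(rectified))   # an octave up, at 200 Hz
```

The rectified waveform's strongest component lands at exactly twice the input frequency, which is the octave-up "bloom" described above.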

Third, the wah-wah pedal is a band-pass filter: Frequency plots show the center frequency sweeping from roughly 300 hertz to 2 kilohertz. Hendrix used it to make the guitar “talk” with vowel sounds, most iconically on “Voodoo Child (Slight Return).”
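The wah's behavior can be approximated with an ordinary second-order resonant band-pass whose center frequency is swept across the quoted 300 Hz to 2 kHz range. The sketch below uses the standard audio-cookbook biquad coefficients; the Q value is my assumption, since the real pedal's resonance varies with the rocker position:

```python
import numpy as np

def wah_peak(center_hz, fs=44_100, Q=5.0):
    """Peak frequency of a resonant band-pass biquad centered at center_hz."""
    # Standard (RBJ audio-cookbook) constant-peak-gain band-pass coefficients
    w0 = 2 * np.pi * center_hz / fs
    alpha = np.sin(w0) / (2 * Q)
    b = np.array([alpha, 0.0, -alpha])
    a = np.array([1 + alpha, -2 * np.cos(w0), 1 - alpha])
    # Evaluate |H| on a dense grid up to Nyquist and locate the maximum
    w = np.linspace(1e-3, np.pi, 8_192)
    zinv = np.exp(-1j * w)
    H = (b[0] + b[1] * zinv + b[2] * zinv**2) / (a[0] + a[1] * zinv + a[2] * zinv**2)
    return w[np.argmax(np.abs(H))] * fs / (2 * np.pi)

for pos_hz in (300, 800, 2_000):   # heel-down ... toe-down rocker sweep
    print(f"center {pos_hz} Hz -> response peaks near {wah_peak(pos_hz):.0f} Hz")
```

Sweeping the rocker position moves the resonant peak smoothly across the vocal-formant range, which is what lets the pedal produce its "talking" vowel sounds.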

Fourth, the Uni-Vibe cascades four phase-shift sections controlled by photoresistors. In circuit terms, it’s a low-frequency oscillator modulating a variable-phase network; in musical terms it’s motion and air.

Finally, the whole chain became a closed loop by driving the Marshall amplifier near saturation, which among other things extends the sustain. In a reflective room, the guitar strings couple acoustically to the speakers—move a few centimeters and you shift from one stable feedback mode to another. To an engineer, this is a gain-controlled acoustic feedback system. To Hendrix, it was part of the instrument. He learned to tune oscillation with distance and angle, shaping sirens, bombs, and harmonics by walking the edge of instability.

Hendrix didn’t speak in decibels and ohm values, but he collaborated with engineers who did—Mayer and Kramer—and iterated fast as a systems engineer. Reframing Hendrix as an engineer doesn’t diminish the art. It explains how one person, in under four years as a bandleader, could pull the electric guitar toward its full potential by systematically augmenting the instrument to overcome its shortcomings for maximum expression.

This article appears in the March 2026 print issue as “Jimi Hendrix, Systems Engineer.”

Reference: https://ift.tt/76Im1JV

Tuesday, February 24, 2026

This Physics Professor Credits Collaboration for Her Success




For Cinzia DaVià, collaboration isn’t just a buzzword. It’s the approach she applies to all her professional endeavors.

From her contributions to the development of a silicon sensor used in CERN (European Organization for Nuclear Research) particle accelerator experiments to her current research on portable energy generation solutions, there’s a common thread.

Cinzia DaVià


Employers

University of Manchester, England;

Stony Brook University, in New York

Job titles

Professor of physics; research professor

Member grade

Senior member

Alma maters

University of Bologna, Italy; University of Glasgow

As a professor of physics at the University of Manchester, in England, and a research professor at Stony Brook University, in New York, she has built strong connections across academic disciplines. Her continued involvement at CERN connects her with a broad array of professionals.

DaVià, an IEEE senior member, says she leverages her expertise and her network of collaborators to solve problems and build solutions. Her efforts include advancing high-energy particle experiments, improving cancer treatments, and mitigating the effects of climate change.

Collaboration is the foundation for any project’s success, she says. She credits IEEE for making many of her professional connections possible.

Even though she is the driving force behind building her alliances, she prefers to shine the spotlight on others, she says. For her, focusing on teamwork is more important than identifying individual contributions.

“The people involved in any project are really the ones to be celebrated,” she says. “The focus should be on them, not me.”

A career influenced by Italian television

As a young child growing up in the Italian Dolomites, her passion for physics was sparked by a popular documentary series, “Astronomia,” an Italian version of Carl Sagan’s renowned “Cosmos” series. The show was DaVià’s introduction to the world of astrophysics. She enrolled at Italy’s Alma Mater Studiorum/University of Bologna, confident she would pursue a degree in astronomy and astrophysics.

A summer internship at CERN in Geneva changed her career trajectory. She helped construct experiments for the Large Electron-Positron collider there. The LEP remains the largest electron-positron accelerator ever built. To house the LEP’s 27-kilometer ring, an underground tunnel was dug on the CERN campus; it was Europe’s biggest civil engineering project at the time.

The LEP was designed to validate the standard model of particle physics, which until then was a theoretical framework that attempted to explain the universe’s building blocks. The experiments—which performed precision measurements of the W and Z bosons, the charged and neutral carriers of the weak force—confirmed the standard model.

The LEP also paved the way, figuratively and literally, for CERN’s Large Hadron Collider. Following the LEP’s decommissioning in 2000, it was dismantled to make way for the LHC in the same underground testing tunnel.

As DaVià’s summer internship work on LEP experiments progressed, her professional focus shifted. Her plans to work in astrophysics gradually transitioned to a focus on radiation instrumentation.

After graduating in 1989 with a physics degree, she returned to CERN for a one-year assignment. As she got more involved in research and development for the large collider experiments, her one year turned into 10.

She received a CERN fellowship to help her finish her Ph.D. in physics at the University of Glasgow—which she received in 1997. Her work focused on radiation detectors and their applications in medicine.

“Nothing was programmed,” she says of her career trajectory. “It was always an opportunity that came after another opportunity, and things evolved along the way.”

A fusion of research and results

During her decade at CERN from 1989 to 1999, she contributed to several groundbreaking discoveries. One involved the radiation hardness of silicon sensors at cryogenic temperatures, referred to in physics as the Lazarus effect.

In the world of collider experiments, the silicon sensors function as eyes that capture the first moments of particle creation. The sensors are part of a larger detector unit that takes millions of images per second, helping scientists better understand particle creation.

In large collider experiments, the silicon sensors suffer significant damage from the radiation generated. After repeated exposure, the sensors eventually become nonfunctional.

DaVià’s contributions helped develop the process of reviving the dead detectors by cooling them down to temperatures below −143 °C.

Her proudest professional accomplishment, she says, was a different discovery at CERN: Her research helped usher in a new era of large collider experiments.

For many years, researchers there used planar silicon sensors in collider experiments. But as the large colliders grew more sophisticated and capable, the traditional planar silicon design couldn’t withstand the extreme radiation present at the epicenter of collider collisions.

DaVià’s research contributed to the development, together with inventor Sherwood Parker, of 3D silicon sensors that could withstand extreme radiation.

The new sensors are radiation-resistant and exceptionally fast, she says.

Scientists began replacing planar sensors in the detectors deployed closest to the center of each collision. Planar detectors are still widely used in collider experiments but farther from direct impacts.

The development of the 3D silicon sensor was groundbreaking, but DaVià says she is proud of it for a different reason. The collaborative approach of the cross-functional R&D team she built is the most noteworthy outcome, she says.

Initially, people with conservative scientific views resisted the idea of creating a new sensor technology, she says. She was able to bring together a broad coalition of scientists, researchers, and industry leaders to work together, despite the initial skepticism and competing interests. The team included two companies that were direct competitors.

That type of industry collaboration was unheard of at the time, she says.

“I was able to convince them,” she says, “that working together would be the best and fastest way forward.”

Her approach succeeded. The two companies not only worked side by side but also exchanged proprietary information. They went so far as to agree that if something halted progress for one of them, it would ship everything to the other so production could continue.

DaVià coauthored a book about the project, Radiation Sensors With 3D Electrodes.

A focus on sustainable entrepreneurship

DaVià has long been concerned about the impact of extreme weather events, especially on underserved populations. Her interest transformed into action after she attended the American Institute of Architects International and AIA Japan Osaka World Expo last year.

During the symposium, held in June, panelists shared insights about natural disasters in their regions and identified steps that could help mitigate damage and protect lives.

The topics that particularly interested DaVià, she says, were excessive glacial melt in the Himalayas and the lack of tsunami warnings on remote Indonesian islands.

One of the ideas that surfaced during a brainstorming session was that of “smart shelters” that could be deployed in remote areas to assist in recovery efforts. The shelters would provide power and a means of communication during outages.

The concept was inspired by MOVE, an IEEE-USA initiative. The MOVE program provides communities affected by natural disasters with power and communications capabilities. The services are contained within MOVE vehicles and are powered by generators. A single MOVE vehicle can charge up to 100 phones, bolstering communication capabilities for relief agencies and disaster survivors.

DaVià’s knowledge of MOVE guided the evolution of the smart shelter concept. She recognized, however, that the challenge of powering portable shelters needed to be solved. She took the lead and formed a cross-disciplinary team of IEEE members and other professionals to make headway. One result is a planned two-day conference on sustainable entrepreneurship to be held at CERN in October.

The goal of the conference, she says, is to “join the dots across different disciplines by involving as many IEEE societies and external experts as possible to work toward deployable solutions that help improve life for people around the world.”

The two-day event will include a competition focusing on solutions for sustainable energy generation and storage systems, she says, adding that entrepreneurs will share their ideas on the second day.

Her commitment to developing solutions to mitigate destruction caused by extreme weather led to her involvement with the IEEE Online Forum on Climate Change Technologies. She led the way in creating the Climate Change Initiative within the IEEE Nuclear and Plasma Sciences Society (NPSS).

She was the driving force behind securing funding for two of the society’s climate-related events. One was the 2024 Climate Workshop on Nuclear and Plasma Solutions for Energy and Society. The second event, building on the success of the first, was last year’s workshop: Nuclear and Plasma Opportunities for Energy and Society, held in conjunction with the Osaka World Expo.

New paths to guide others

DaVià reduced her involvement at CERN when she joined the faculty at the University of Manchester as a physics professor. In 2016 she joined Stony Brook University as a research professor in the physics and astronomy department. She divides her time between the two schools.

She still maintains an office at CERN, where she works with students involved with particle physics. She is also an advisory board member of its IdeaSquare, an innovation space where science, technology, and entrepreneurial minds gather to brainstorm and test ideas. The goal is to identify ways to apply innovations generated by high-energy physics experiments to solve global challenges.

DaVià is the radiation detectors and imaging editor of Frontiers in Physics and a cochair of the European Union’s ATTRACT initiative, which promotes radiation imaging research across the continent. She is an active member of the European Physical Society, and she is an IEEE liaison officer for the physics and industry working group of the International Union of Pure and Applied Physics.

She has coauthored more than 900 publications.

IEEE as the connector

DaVià’s involvement with IEEE dates back to her undergraduate years, when she was introduced to the organization at a conference sponsored by the IEEE NPSS.

As her career grew, so did her involvement with IEEE.

She remains active with the society as a distinguished lecturer. She is a member of the IEEE Society of Social Implications of Technology, the IEEE Power & Energy Society, and the IEEE Women in Engineering group. She received the 2022 WIE Outstanding Volunteer of the Year Award.

She stays involved in IEEE to help her understand the work being done within each society and identify opportunities for cross-collaboration, she says. She sees such synergies as a key benefit of membership.

“IEEE helps bring people together who might not otherwise connect,” she says. “We are stronger together with IEEE.”

Reference: https://ift.tt/CNfEik9

Your Watch Will One Day Track Blood Pressure




Your smartwatch can track a lot of things, but at least for now, it can’t keep an accurate eye on your blood pressure. Last week researchers from the University of Texas at Austin showed a way your smartwatch someday could. They were able to discern blood pressure by reflecting radio signals off a person’s wrist, and they plan to integrate the electronics that did it into a smartwatch within a couple of years.

Besides the tried-and-true blood pressure cuff, researchers have found several new ways to monitor blood pressure, using pasted-on ultrasound transducers, electrocardiogram sensors, bioimpedance measurements, photoplethysmography, and combinations of these measurements.

“We found that existing methods all face limitations,” Yiming Han, a doctoral candidate in the lab of Yaoyao Jia, told engineers at the IEEE International Solid-State Circuits Conference (ISSCC) last week in San Francisco. For example, ultrasound sensing requires long-term contact with the skin. And as cool as electronic tattoos seem, they’re not as convenient or comfortable as a smartwatch. Photoplethysmography, which detects the oxygenation state of blood using light, doesn’t need direct contact, and indeed researchers in Tehran and California recently used it and a heavy dose of machine learning to monitor blood pressure. However, these optical sensors are thought to be sensitive to a person’s skin tone; sensors based on the same principle were blamed for Black people in the United States receiving inadequate treatment during the COVID-19 pandemic.

The University of Texas team sought a non-contact solution that was immune to skin-tone bias and could be integrated into a small device.

Continuous Blood Pressure Monitoring

Blood pressure measurements consist of two readings—systole, the peak pressure when the heart contracts and forces blood into arteries, and diastole, the phase in between heart contractions when pressure drops. During systole, blood vessels expand and stiffen and blood velocity increases. The opposite occurs in diastole.

All these changes alter conductivity, dielectric properties, and other tissue properties, so they should show up in reflected near-field radio waves, Jia’s colleague Deji Akinwande reasoned. Near-field waves are radiation impacting a surface that is less than one wavelength from the radiation’s source.

The researchers were able to test this idea using a common laboratory instrument called a vector network analyzer. Among its abilities, the analyzer can sense RF reflection, and the team was able to quickly correlate the radio response to blood pressure measured using standard medical equipment.

What Akinwande and Jia’s team saw was this: During systole, reflected near-field waves were more strongly out of phase with the transmitted radiation, while in diastole the reflections were weaker and closer to being in phase with the transmission.
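The trend the researchers observed can be sketched as a toy classifier over a complex reflection coefficient. The structure below follows the qualitative pattern reported by the team, but the magnitude and phase thresholds are invented for illustration; they are not from the team’s paper:

```python
import cmath
import math

def cardiac_phase(reflection: complex) -> str:
    """Classify one reflected-signal sample as systole or diastole.

    Illustrative only: systole shows up as a stronger, more out-of-phase
    reflection; diastole as a weaker, closer-to-in-phase one. The 0.5 and
    90-degree cutoffs are placeholder values, not measured thresholds.
    """
    magnitude = abs(reflection)  # strength of the reflection
    phase_deg = abs(math.degrees(cmath.phase(reflection)))  # deviation from in-phase
    if magnitude > 0.5 and phase_deg > 90:
        return "systole"
    return "diastole"
```

In the real system, this decision would be driven by the custom chip’s digitized reflection data and calibrated against cuff readings, not hand-set thresholds.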

You obviously can’t lug around a US $50,000 analyzer just to keep track of your blood pressure, so the team created a wearable system to do the job. It consists of a patch antenna strapped to a person’s wrist. The antenna connects to a device called a circulator—a kind of traffic roundabout for radio signals that steers outgoing signals to the antenna and signals coming in from the antenna to a separate circuit. A custom-designed integrated circuit feeds a 2.4 gigahertz microwave signal into one of the circulator’s on-ramps and receives, amplifies, and digitizes the much weaker reflection coming in from another branch. The whole system consumes just 3.4 milliwatts.

“Our work is the only one to provide no skin contact and no skin-tone bias,” Han said.

The next version of the device will use multiple radio frequencies to increase accuracy, says Jia, “because different people’s tissue conditions are different” and some might respond better to one or another. Like the 2.4 gigahertz used in the prototype, these other frequencies will be of the sort already in common use, such as 5 GHz (a Wi-Fi frequency) and 915 megahertz (a cellular frequency).

Following those experiments, Jia’s team will turn to building the device into a smartwatch form factor and testing it more broadly for possible commercialization.

Reference: https://ift.tt/WyZHrtP

Monday, February 23, 2026

AI’s Math Tricks Don’t Work for Scientific Computing




AI has driven an explosion of new number formats—the ways in which numbers are represented digitally. Engineers are looking at every possible way to save computation time and energy, including shortening the number of bits used to represent data. But what works for AI doesn’t necessarily work for scientific computing, be it for computational physics, biology, fluid dynamics, or engineering simulations. IEEE Spectrum spoke with Laslo Hunhold, who recently joined Barcelona-based Openchip as an AI engineer, about his efforts to develop a bespoke number format for scientific computing.

LASLO HUNHOLD


Laslo Hunhold is a senior AI accelerator engineer at Barcelona-based startup Openchip. He recently completed a Ph.D. in computer science at the University of Cologne, in Germany.

What makes number formats interesting to you?

Laslo Hunhold: I don’t know another example of a field that so few are interested in but has such a high impact. If you make a number format that’s 10 percent more [energy] efficient, it can translate to all applications being 10 percent more efficient, and you can save a lot of energy.

Why are there so many new number formats?

Hunhold: For decades, computer users had it really easy. They could just buy new systems every few years, and they would have performance benefits for free. But this hasn’t been the case for the last 10 years. In computers, you have a certain number of bits used to represent a single number, and for years the default was 64 bits. And for AI, companies noticed that they don’t need 64 bits for each number. So they had a strong incentive to go down to 16, 8, or even 2 bits [to save energy]. The problem is, the dominating standard for representing numbers in 64 bits is not well designed for lower bit counts. So in the AI field, they came up with new formats which are more tailored toward AI.

Why does AI need different number formats than scientific computing?

Hunhold: Scientific computing needs high dynamic range: You need very large numbers, or very small numbers, and very high accuracy in both cases. The 64-bit standard has an excessive dynamic range, and it is many more bits than you need most of the time. It’s different with AI. The numbers usually follow a specific distribution, and you don’t need as much accuracy.
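The tradeoff Hunhold describes is easy to see in the standard IEEE 754 floating-point formats (a sketch of the bit-count effect only, not of posits or takums):

```python
import numpy as np

# Fewer bits buy less dynamic range and less precision. float16's largest
# representable value is only 65504, while float64 reaches roughly 1.8e308,
# and the gap between adjacent representable numbers (machine epsilon) grows
# as the bit count shrinks.
for dtype in (np.float64, np.float32, np.float16):
    info = np.finfo(dtype)
    print(f"{dtype.__name__}: smallest normal = {info.tiny:.3g}, "
          f"largest = {info.max:.3g}, machine epsilon = {info.eps:.3g}")
```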

What makes a number format “good”?

Hunhold: You have infinite numbers but only finite bit representations. So you need to decide how you assign numbers. The most important part is to represent numbers that you’re actually going to use. Because if you represent a number that you don’t use, you’ve wasted a representation. The simplest thing to look at is the dynamic range. The next is distribution: How do you assign your bits to certain values? Do you have a uniform distribution, or something else? There are infinite possibilities.

What motivated you to introduce the takum number format?

Hunhold: Takums are based on posits. In posits, the numbers that get used more frequently can be represented with more density. But posits don’t work for scientific computing, and this is a huge issue. They have a high density for [numbers close to one], which is great for AI, but the density falls off sharply once you look at larger or smaller values. People have been proposing dozens of number formats in the last few years, but takums are the only number format that’s actually tailored for scientific computing. I found the dynamic range of values you use in scientific computations, if you look at all the fields, and designed takums such that when you take away bits, you don’t reduce that dynamic range.

This article appears in the March 2026 print issue as “Laslo Hunhold.”

Reference: https://ift.tt/BDlCua2

The Age Verification Trap




Social media is going the way of alcohol, gambling, and other social sins: societies are deciding it’s no longer kids’ stuff. Lawmakers point to compulsive use, exposure to harmful content, and mounting concerns about adolescent mental health. So, many propose to set a minimum age, usually 13 or 16.

In cases when regulators demand real enforcement rather than symbolic rules, platforms run into a basic technical problem. The only way to prove that someone is old enough to use a site is to collect personal data about who they are. And the only way to prove that you checked is to keep the data indefinitely. Age-restriction laws push platforms toward intrusive verification systems that often directly conflict with modern data-privacy law.

This is the age-verification trap. Strong enforcement of age rules undermines data privacy.

How Does Age Enforcement Actually Work?

Most age-restriction laws follow a familiar pattern. They set a minimum age and require platforms to take “reasonable steps” or “effective measures” to prevent underage access. What these laws rarely spell out is how platforms are supposed to tell who is actually over the line. At the technical level, companies have only two tools.

The first is identity-based verification. Companies ask users to upload a government ID, link a digital identity, or provide documents that prove their age. Yet in many jurisdictions, 16-year-olds do not have IDs. In others, IDs exist but are not digital, not widely held, or not trustworthy. Storing copies of identity documents also creates security and misuse risks.

The second option is inference. Platforms try to guess age based on behavior, device signals, or biometric analysis, most commonly facial age estimation from selfies or videos. This avoids formal ID collection, but it replaces certainty with probability and error.

In practice, companies combine both. Self-declared ages are backed by inference systems. When confidence drops, or regulators ask for proof of effort, inference escalates to ID checks. What starts as a light-touch checkpoint turns into layered verification that follows users over time.
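That escalation logic can be sketched as a simple decision function. This is a hypothetical pipeline for illustration, not any platform’s actual implementation, and the confidence thresholds are made up:

```python
def check_age(self_declared_adult: bool, inferred_confidence: float) -> str:
    """Return the next step in a layered age-verification flow.

    inferred_confidence is the platform's estimated probability that the
    user is over the threshold, derived from behavioral or biometric
    inference. Thresholds here are placeholders.
    """
    if not self_declared_adult:
        return "blocked"            # self-declaration already fails
    if inferred_confidence >= 0.9:
        return "allowed"            # inference agrees; light touch
    if inferred_confidence >= 0.5:
        return "request_face_scan"  # uncertain: escalate to a selfie check
    return "request_id"             # low confidence: demand documents
```

Each escalation step produces data the platform then has to store and defend, which is where the trap begins.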

What Are Platforms Doing Right Now?

This pattern is already visible on major platforms.

Meta has deployed facial age estimation on Instagram in multiple markets, using video-selfie checks through third-party partners. When the system flags users as possibly underage, it prompts them to record a short selfie video. An AI system estimates their age and, if it decides they are under the threshold, restricts or locks the account. Appeals often trigger additional checks, and misclassifications are common.

TikTok has confirmed that it also scans public videos to infer users’ ages. Google and YouTube rely heavily on behavioral signals tied to viewing history and account activity to infer age, then ask for government ID or a credit card when the system is unsure. A credit card functions as a proxy for adulthood, even though it says nothing about who is actually using the account. The Roblox games site, which recently launched a new age-estimate system, is already suffering from users selling child-aged accounts to adult predators seeking entry to age-restricted areas, Wired reports.

For a typical user, age is no longer a one-time declaration. It becomes a recurring test. A new phone, a change in behavior, or a false signal can trigger another check. Passing once does not end the process.

How Do Age Verification Systems Fail?

These systems fail in predictable ways.

False positives are common. Platforms misclassify as minors adults who have youthful faces, share family devices, or show otherwise unusual usage patterns. They lock accounts, sometimes for days. False negatives also persist. Teenagers learn quickly how to evade checks by borrowing IDs, cycling accounts, or using VPNs.

The appeal process itself creates new privacy risks. Platforms must store biometric data, ID images, and verification logs long enough to defend their decisions to regulators. So if an adult who is tired of submitting selfies to verify their age finally uploads an ID, the system must now secure that stored ID. Each retained record becomes a potential breach target.

Scale that experience across millions of users, and you bake the privacy risk into how platforms work.

Is Age Verification Compatible with Privacy Law?

This is where emerging age-restriction policy collides with existing privacy law.

Modern data-protection regimes all rest on similar ideas: collect only what you need, use it only for a defined purpose, and keep it only as long as necessary.
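In code, data minimization looks like storing the outcome of a check rather than the evidence behind it. The record type below is hypothetical, for illustration only, not any platform’s actual schema:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass(frozen=True)
class AgeCheckRecord:
    """A data-minimizing verification record: outcome and expiry only.

    Deliberately absent: birthdate, ID image, selfie video -- the evidence
    that produced the result. The tension is that regulators may later
    demand exactly the evidence this record discards.
    """
    passed: bool          # over the legal threshold, yes or no
    checked_at: datetime  # when the check happened
    ttl_days: int = 365   # retained only as long as necessary

    def expired(self, now: datetime) -> bool:
        return now > self.checked_at + timedelta(days=self.ttl_days)
```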

Age enforcement undermines all three.

To prove they are following age verification rules, platforms must log verification attempts, retain evidence, and monitor users over time. When regulators or courts ask whether a platform took reasonable steps, “we collected less data” is rarely persuasive. For companies, defending themselves against accusations of neglecting to properly verify age supersedes defending themselves against accusations of inappropriate data collection.

This shift is not an explicit choice by voters or policymakers, but instead a reaction to enforcement pressure and how companies perceive their litigation risk.

Less Developed Countries, Deeper Surveillance

Outside wealthy democracies, the tradeoff is even starker.

Brazil’s Statute of the Child and Adolescent (ECA in Portuguese) imposes strong child-protection duties online, while its data protection law restricts data collection and processing. Now providers operating in Brazil must adopt effective age-verification mechanisms and can no longer rely on self-declaration alone for high-risk services. Yet they also face uneven identity infrastructure and widespread device sharing. To compensate, they rely more heavily on facial estimation and third-party verification vendors.

In Nigeria many users lack formal IDs. Digital service providers fill the gap with behavioral analysis, biometric inference, and offshore verification services, often with limited oversight. Audit logs grow, data flows expand, and the practical ability of users to understand or contest how companies infer their age shrinks accordingly. Where identity systems are weak, companies do not protect privacy. They bypass it.

The paradox is clear. In countries with less administrative capacity, age enforcement often produces more surveillance, not less, because inference fills the void of missing documents.

How Do Enforcement Priorities Change Expectations?

Some policymakers assume that vague standards preserve flexibility. In the U.K., then–Digital Secretary Michelle Donelan argued in 2023 that requiring certain online safety outcomes without specifying the means would avoid mandating particular technologies. Experience suggests the opposite.

When disputes reach regulators or courts, the question is simple: can minors still access the platform easily or not? If the answer is yes, authorities tell companies to do more. Over time, “reasonable steps” become more invasive.

Repeated facial scans, escalating ID checks, and long-term logging become the norm. Platforms that collect less data start to look reckless by comparison. Privacy-preserving designs lose out to defensible ones.

This pattern is familiar from other domains, including online sales tax enforcement. After courts settled that large platforms had an obligation to collect and remit sales taxes, companies began continuous tracking and storage of transaction destinations and customer location signals. That tracking is not abusive, but once enforcement requires proof over time, companies build systems to log, retain, and correlate more data. Age verification is moving the same way. What begins as a one-time check becomes an ongoing evidentiary system, with pressure to monitor, retain, and justify user-level data.

The Choice We Are Avoiding

None of this is an argument against protecting children online. It is an argument against pretending there is no tradeoff.

Some observers present privacy-preserving age proofs involving a third party, such as the government, as a solution, but they inherit the same structural flaw: many users who are legally old enough to use a platform do not have government ID. In countries where the minimum age for social media is lower than the age at which ID is issued, platforms face a choice between excluding lawful users and monitoring everyone. Right now, companies are making that choice quietly, after building systems and normalizing behavior that protects them from the greater legal risks. Age-restriction laws are not just about kids and screens. They are reshaping how identity, privacy, and access work on the Internet for everyone.

The age-verification trap is not a glitch. It is what you get when regulators treat age enforcement as mandatory and privacy as optional.

Reference: https://ift.tt/WTQId71

Sunday, February 22, 2026

Poem: The Attraction of Blackberries




The first time she tried to seduce me,
(atoms falling in a vacuum)
she asked about blackberries—
(every mass exerts some gravity)

Did I know their season, where they grow?
(galvanometers, gravimeters)
I could answer both easily—
(tools to measure small attractions)

down the dirt road in September.
(devices that report, don’t interfere)
She eagerly went there with me,
(variations in readings occur)

We ate more berries than we kept.
(electron exchange may explain this)
The sweet dark juice painted our lips.
(equilibrium then entropy)

Reference: https://ift.tt/VFDYNuB

Saturday, February 21, 2026

AI Data Centers Turn to High-Temperature Superconductors




Data centers for AI are turning the world of power generation on its head. There isn’t enough capacity on the grid to come close to meeting the energy needs of the data centers being built. And traditional transmission and distribution networks aren’t efficient enough to take full advantage of the power that is available. According to the U.S. Energy Information Administration (EIA), annual transmission and distribution losses average about 5 percent. The rate is much higher in some other parts of the world. Hence, hyperscalers such as Amazon Web Services, Google Cloud, and Microsoft Azure are investigating every avenue to gain more power and raise efficiency.

Microsoft, for example, is extolling the potential virtues of high-temperature superconductors (HTS) as a replacement for copper wiring. According to the company, HTS can improve energy efficiency by reducing transmission losses, increasing the resiliency of electrical grids, and limiting the impact of data centers on communities by reducing the amount of space required to move power.

“Because superconductors take up less space to move large amounts of power, they could help us build cleaner, more compact systems,” Alastair Speirs, the general manager of global infrastructure at Microsoft, wrote in a blog post.

Superconductors Revolutionize Power Efficiency

Copper is a good conductor, but current encounters resistance as it moves along the line. This generates heat, lowers efficiency, and restricts how much current can be moved. HTS largely eliminates this resistance factor, as it’s made of superconducting materials that are cooled to cryogenic temperatures. (Despite the name, high-temperature superconductors still rely on frigid temperatures—albeit significantly warmer than those required by traditional superconductors.)
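A back-of-the-envelope calculation makes the copper penalty concrete. The current and resistance values below are assumed round numbers for illustration, not figures from Veir or Microsoft:

```python
def i2r_loss_kw(current_a: float, resistance_ohm: float) -> float:
    """Resistive (Joule) loss, P = I^2 * R, returned in kilowatts."""
    return current_a**2 * resistance_ohm / 1000.0

# A hypothetical 2 kA feeder with 0.02 ohm of total copper resistance
# dissipates 80 kW as heat. An HTS cable at operating temperature has
# near-zero resistance, so its conduction loss is negligible by comparison.
copper_loss = i2r_loss_kw(2_000, 0.02)
print(f"{copper_loss:.0f} kW lost as heat in the copper run")
```

Because the loss scales with the square of the current, the penalty grows quickly at the current densities AI data centers demand.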

The resulting cables are smaller and lighter than copper wiring, don’t lower voltage as they transmit current, and don’t produce heat. This fits nicely into the needs of AI data centers that are trying to cram massive electrical loads into a tiny footprint. Fewer substations would also be needed. According to Speirs, next-gen superconducting transmission lines deliver capacity that is an order of magnitude higher than conventional lines at the same voltage level.

Microsoft is working with partners on the advancement of this technology including an investment of US $75 million into Veir, a superconducting power technology developer. Veir’s conductors use HTS tape, most commonly based on a class of materials known as rare-earth barium copper oxide (REBCO). REBCO is a ceramic superconducting layer deposited as a thin film on a metal substrate, then engineered into a rugged conductor that can be assembled into power cables.

“The key distinction from copper or aluminum is that, at operating temperature, the superconducting layer carries current with almost no electrical resistance, enabling very high current density in a much more compact form factor,” says Tim Heidel, Veir’s CEO and co-founder.

Liquid Nitrogen Cooling in Data Centers

Ruslan Nagimov, the principal infrastructure engineer for Cloud Operations and Innovation at Microsoft, stands near the world’s first HTS-powered rack prototype. [Photo: Microsoft]

HTS cables still operate at cryogenic temperatures, so cooling must be integrated into the power delivery system design. Veir maintains a low operating temperature using a closed-loop liquid nitrogen system: The nitrogen circulates through the length of the cable, exits at the far end, is re-cooled, and then recirculated back to the start.

“Liquid nitrogen is a plentiful, low cost, safe material used in numerous critical commercial and industrial applications at enormous scale,” says Heidel. “We are leveraging the experience and standards for working with liquid nitrogen proven in other industries to design stable data center solutions designed for continuous operation, with monitoring and controls that fit critical infrastructure expectations rather than lab conditions.”

HTS cable cooling can either be done within the data center or externally. Heidel favors the latter, as that minimizes footprint and operational complexity indoors. Liquid nitrogen lines are fed into the facility to serve the superconductors, which deliver power to where it’s needed, and the cooling system is managed like any other facility subsystem.

Rare earth materials, cooling loops, cryogenic temperatures—all of this adds considerably to costs. Thus, HTS isn’t going to replace copper in the vast majority of applications. Heidel says the economics are most compelling where power delivery is constrained by space, weight, voltage drop, and heat.

“In those cases, the value shows up at the system level: smaller footprints, reduced resistive losses, and more flexibility in how you route power,” says Heidel. “As the technology scales, costs should improve through higher-volume HTS tape manufacturing and better yields, and also through standardization of the surrounding system hardware, installation practices, and operating playbooks that reduce design complexity and deployment risk.”

AI data centers are becoming the perfect proving ground for this approach. Hyperscalers are willing to spend to develop higher-efficiency systems. They can balance spending on development against the revenue they might make by delivering AI services broadly.

“HTS manufacturing has matured—particularly on the tape side—which improves cost and supply availability,” says Husam Alissa, Microsoft’s director of systems technology. “Our focus currently is on validating and derisking this technology with our partners with focus on systems design and integration.”

Reference: https://ift.tt/fhUlaAu

Friday, February 20, 2026

IEEE Plays a Pivotal Role In Climate Mitigation Talks




IEEE has enhanced its standing as a trusted, neutral authority on the role of technology in climate change mitigation and adaptation. Last year it became the first technical association to be invited to a U.N. Conference of the Parties on Climate Change.

IEEE representatives participated in several sessions at COP30, held from 11 to 20 November in Belém, Brazil. More than 56,000 delegates attended, including policymakers, technologists, and representatives from industry, finance, and development agencies.

Following the conference, IEEE helped host the selective International Symposium on Achieving a Sustainable Climate. The International Telecommunication Union and IEEE hosted ISASC on 16 and 17 December at ITU’s headquarters in Geneva. Among the more than 100 people who attended were U.N. agency representatives, diplomats, senior leaders from academia, and experts from government, industry, nongovernment organizations, and standards development bodies.

Power and energy expert Saifur Rahman, the 2023 IEEE president, led IEEE’s delegation at both events. Rahman is the immediate past chair of IEEE’s Technology for a Sustainable Climate Matrix Organization, which coordinates, communicates, and amplifies the organization’s efforts.

IEEE’s evolving role at COP

IEEE first attended a COP in 2021.

“Over successive COPs, IEEE’s role has evolved from contributing individual technical sessions to being recognized as a trusted partner in climate action,” Rahman noted in a summary of COP30. “There is [a] growing demand for engineering insight, not just to discuss technologies but [also] to help design pathways for deployment, capacity-building, and long-term resilience.”

Joining Rahman at COP30 were IEEE Fellow Claudio Canizares and IEEE Member Filipe Emídio Tôrres.

Canizares is a professor of electrical and computer engineering at the University of Waterloo, in Ontario, Canada, and the executive director of the university’s sustainable energy institute.

Tôrres chairs the IEEE Centro-Norte Brasil Section (Brazil Chapter). An entrepreneur and a former professor, he is pursuing a Ph.D. in biomedical engineering at the University of Brasilia. He also represented the IEEE Young Professionals group while attending the conference.

In the Engineering for Climate Resilience: Water Planning, Energy Transition, Biodiversity session, Rahman showed a video from his 2024 visit to Shennongjia, China, where he monitored a clean energy project designed to protect endangered snub-nosed monkeys from human encroachment. The project integrates renewable energy, which helps preserve the forest and its wildlife.

Rahman also chaired a session at the Sustainable Development Goal Pavilion on balancing decarbonization efforts between industrialized and emerging economies.

Additionally, he participated in a joint panel discussion hosted by IEEE and the World Federation of Engineering Organizations on engineering strategies for climate resilience, including energy transition and biodiversity.

Rahman, Canizares, and Tôrres took part in a session on clean-tech solutions for a sustainable climate, hosted by the International Youth Nuclear Congress. The topics included fossil fuel–free electricity for communications in remote areas and affordable electricity solutions for off-grid areas.

The three also joined several panels organized by the IYNC that addressed climate resilience, career pathways in sustainability, and a mentoring program.

“Over successive COPs, IEEE’s role has evolved from contributing individual technical sessions to being recognized as a trusted partner in climate action.” —Saifur Rahman, 2023 IEEE president

The IYNC hosted the Voices of Transition: Including Pathways to a Clean Energy Future session, for which Tôrres and Rahman were panelists. They discussed the need to include underrepresented and marginalized groups, which often get overlooked in projects that convert communities to renewable energy.

Rahman, Canizares, and Tôrres visited the COP Village, where they met several of the 5,000 Indigenous leaders participating in the conference and discussed potential partnerships and collaborations. Climate change has made the land where the Indigenous people live more susceptible to severe droughts and wildfires, particularly in the Amazon region.

Rahman and Tôrres took a field trip to the Federal University of Pará, where they met several faculty members and students and toured the LASSE engineering lab.

A meaningful experience

Tôrres, who says representing IEEE at COP30 was transformative, wrote a detailed report about the event.

“The experience reaffirmed my belief that engineering and technology, when combined with respect for cultural diversity, can play a critical role in shaping a more sustainable and equitable world,” he wrote. “It highlighted the importance of combining cutting-edge technological solutions with Indigenous wisdom and cultural knowledge to address the climate crisis.”

COP30 webinar

Rahman and Canizares give an overview of their COP30 experiences in an IEEE webinar.

“IEEE has a place at the table,” Rahman says in the video. “We want to showcase outside our comfort zone what IEEE can do. We go to all these global events so that our name becomes a familiar term. We are the first technical association ever to go to COP and talk about engineering.”

Canizares added that IEEE is now collaborating closely with the United Nations.

“This is an important interaction. And I think, moving forward, IEEE will become more relevant, particularly in the context of technology deployment,” he said. “As governments start technology deployments, they will see IEEE as a provider of solutions.”

ISASC takeaways

Rahman was the general chair of the ISASC event, which focused on the delivery and deployment of clean energy. Among the presenters were IEEE members including Canizares, Paulina Chan, Surekha Deshmukh, Ashutosh Dutta, Tariq Durrani, Samina Husain, Bruce Kraemer, Bruno Meyer, Carlo Alberto Nucci, and Seizo Onoe.

Sessions were organized around six themes: energy transition, information and communication technology, financing, case studies, technical standards, and public-private collaborations. A detailed report includes the discussions, insights, and opportunities identified throughout ISASC.

Here are some key takeaways.

  • Although the technology exists to transition to renewable energy, most power grid systems are not ready. Deployment is increasingly constrained by transmission bottlenecks, interconnection delays, permitting challenges, and system flexibility. There’s also a skills shortage.
  • Energy transition pathways must be region-specific and should consider local resources, social conditions, funding opportunities, and development priorities.
  • Information and communication technologies are central to climate mitigation solutions, despite growing concerns about their environmental impact. Even though the technologies are used in beneficial ways, such as early-warning systems for natural disasters and smart water management, they also are driving the rapid growth of data centers for artificial intelligence applications—which has increased energy prices and driven up water demand.
  • Technical standards are a means of accelerating adoption, interoperability, and trust in green technology. There needs to be greater coordination among standards development organizations, particularly at the convergence of energy systems, information technologies, and AI. Fragmented standards hinder interoperability. The lack of technical standards is a major constraint on project financing, limiting investors’ confidence and slowing technology deployment.
  • Training and outreach efforts are important for successfully implementing standards, especially in developing regions. IEEE’s global membership and regional sections can be critical channels to address the needs.

A technology assessment tool

As part of ISASC, IEEE presented a technology assessment tool prototype. The web-based platform is designed to help policymakers, practitioners, and investors compare technology options against climate goals.

The tool can run a comparative analysis of sustainable climate technologies and integrate publicly available, expert-validated data.

IEEE can help the world meet its goals

The ISASC report concluded that by connecting engineering expertise with real-world deployment challenges, IEEE is working to translate global climate goals into measurable actions.

The discussions highlighted that the path forward lies less in inventing new technologies and more in aligning systems to deploy the ones that already exist.

Summaries of COP30 and ISASC are available on the IEEE Technology for a Sustainable Climate website.

Reference: https://ift.tt/bIeEuvS

Video Friday: Humanoid Robots Celebrate Spring




Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

So, humanoid robots are nearing peak human performance. I would point out, though, that this is likely very far from peak robot performance, which has yet to be effectively exploited, because it requires more than just copying humans.

[ Unitree ]

“The Street Dance of China”: Turning lightness into gravity, and rhythm into impact. This is a head-on collision between metal and beats. This Chinese New Year, watch PNDbotics Adam bring the heat with a difference.

[ PNDbotics ]

You had me at robot pandas.

[ MagicLab ]

NASA’s Perseverance rover can now precisely determine its own location on Mars without waiting for human help from Earth. This is possible thanks to a new technology called Mars Global Localization. This technology rapidly compares panoramic images from the rover’s navigation cameras with onboard orbital terrain maps. It’s done with an algorithm that runs on the rover’s Helicopter Base Station processor, which was originally used to communicate with the Ingenuity Mars Helicopter. In a few minutes, the algorithm can pinpoint Perseverance’s position to within about 10 inches (25 centimeters). The technology will help the rover drive farther autonomously and keep exploring.

[ NASA Jet Propulsion Laboratory ]
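The core idea behind map-based localization of this kind, matching locally sensed terrain against an orbital map, can be illustrated with a toy normalized cross-correlation search. This is not JPL's actual algorithm (the names, sizes, and data here are all invented for illustration), just a minimal sketch of the matching principle:

```python
import numpy as np

def locate_patch(orbital_map, patch):
    """Slide a locally sensed terrain patch over an orbital map and return
    the (row, col) offset with the highest normalized cross-correlation
    score -- a toy version of image-to-map localization."""
    ph, pw = patch.shape
    p = (patch - patch.mean()) / (patch.std() + 1e-9)
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(orbital_map.shape[0] - ph + 1):
        for c in range(orbital_map.shape[1] - pw + 1):
            w = orbital_map[r:r + ph, c:c + pw]
            wn = (w - w.mean()) / (w.std() + 1e-9)
            score = float((p * wn).mean())  # correlation of normalized patches
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Toy demo: cut a patch out of a synthetic "map" and recover its position.
rng = np.random.default_rng(0)
terrain = rng.normal(size=(60, 60))
snippet = terrain[17:17 + 12, 23:23 + 12]
pos, score = locate_patch(terrain, snippet)
print(pos)  # (17, 23)
```

A real system works with much larger maps, handles perspective and lighting differences between rover and orbital imagery, and uses far faster search than this brute-force loop, but the recovered offset is the same kind of answer: where the local view sits within the global map.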

Legs? Where we’re going, we don’t need legs!

[ Paper ]

This is a bit of a tangent to robotics, but it gets a pass because of the cute jumping spider footage.

[ Berkeley Lab ]

Corvus One for Cold Chain is engineered to live and operate in freezer environments permanently, down to -20°F, while maintaining full flight and barcode scanning performance.

I am sure there is an excellent reason for putting a cold storage facility in the Mojave desert.

[ Corvus Robotics ]

The video documents the current progress in the picking rate of the Shiva robot when picking strawberries. It first shows the system's previous status, then its further development, and finally a field test.

[ DFKI ]

Data powers an organization’s digital transformation, and ST Engineering MRAS is leveraging Spot to get a full view of critical equipment and facilities. Working autonomously, Spot collects information about machine health, and now, thanks to an integration of the Leica BLK ARC for reality capture, detailed and accurate point-cloud data for their digital twin.

[ Boston Dynamics ]

The title of this video is “Get out and have fun!” Is that mostly what humanoid robots are good for right now, pretty much...?

[ Engine AI ]

ASTORINO is a modern 6-axis robot based on 3D printing technology. Programmable in AS-language, it facilitates the preparation of classes with ready-made teaching materials, is easy both to use and to repair, and gives the opportunity to learn and make mistakes without fear of breaking it.

[ Kawasaki ]

Can I get this in my living room?

[ Yaskawa ]

What does it mean to build a humanoid robot in seven months, and the next one in just five? This documentary takes you behind the scenes at Humanoid, a UK-based AI and robotics company building reliable, safe, and helpful humanoid robots. You’ll hear directly from our engineering, hardware, product, and other teams as they share their perspectives on the journey of turning physical AI into reality.

[ Humanoid ]

This IROS 2025 keynote is from Tim Chung who is now at Microsoft, on “Catalyzing the Future of Human, Robot, and AI Agent Teams in the Physical World.”

The convergence of technologies—from foundation AI models to diverse sensors and actuators to ubiquitous connectivity—is transforming the nature of interactions in the physical and digital world. People have accelerated their collaborative connections and productivity through digital and immersive technologies, no longer limited by geography or language or access. Humans have also leveraged and interacted with AI in many different forms, with the advent of hyperscale AI models (i.e., large language models) forever changing (and at an ever-astonishing pace) the nature of human-AI teams, realized in this era of the AI “copilot.” Similarly, robotics and automation technologies now afford greater opportunities to work with and/or near humans, allowing for increasingly collaborative physical robots to dramatically impact real-world activities. It is the compounding effect of enabling all three capabilities, each complementary to one another in valuable ways, and we envision the triad formed by human-robot-AI teams as revolutionizing the future of society, the economy, and of technology.

[ IROS 2025 ]

This GRASP SFI talk is by Chris Paxton at Agility Robotics, on “How Close Are We To Generalist Humanoid Robots?”

With billions of dollars of funding pouring into robotics, general-purpose humanoid robots seem closer than ever. And certainly it feels like the pace of robotics is faster than ever, with multiple companies beginning large-scale deployments of humanoid robots. In this talk, I’ll go over the challenges still facing scaling robot learning, looking at insights from a year of discussions with researchers all over the world.

[ University of Pennsylvania GRASP Laboratory ]

This week’s CMU RI Seminar is from Jitendra Malik at UC Berkeley, on “Robot Learning, With Inspiration From Child Development.”

For intelligent robots to become ubiquitous, we need to “solve” locomotion, navigation and manipulation at sufficient reliability in widely varying environments. In locomotion, we now have demonstrations of humanoid walking in a variety of challenging environments. In navigation, we pursued the task of “Go to Any Thing” – a robot, on entering a newly rented Airbnb, should be able to find objects such as TV sets or potted plants. RL in simulation and sim-to-real have been workhorse technologies for us, assisted by a few technical innovations. I will sketch promising directions for future work.

[ Carnegie Mellon University Robotics Institute ]

Reference: https://ift.tt/OrnbfsG
