Thursday, February 26, 2026

New AirSnitch attack breaks Wi-Fi encryption in homes, offices, and enterprises


It’s hard to overstate the role that Wi-Fi plays in virtually every facet of life. The organization that shepherds the wireless protocol says that more than 48 billion Wi-Fi-enabled devices have shipped since it debuted in the late 1990s. One estimate pegs the number of individual users at 6 billion, roughly 70 percent of the world’s population.

Despite this dependence and the immeasurable amount of sensitive data flowing through Wi-Fi transmissions, the history of the protocol has been littered with security landmines. These stem both from the inherited confidentiality weaknesses of its networking predecessor, Ethernet (it was once possible for anyone on a network to read and modify the traffic sent to anyone else), and from the ability of anyone nearby to receive the radio signals Wi-Fi relies on.

Ghost in the machine

In the early days, public Wi-Fi networks often resembled the Wild West, where ARP spoofing attacks that allowed renegade users to read other users' traffic were common. The solution was to build cryptographic protections that prevented nearby parties—whether an authorized user on the network or someone near the AP (access point)—from reading or tampering with the traffic of any other user.


Reference: https://ift.tt/EQ4bitB

How Stupid Would It Be to Put Data Centers in Space?




What’s the difference between a stupid idea and a brilliant one? Sometimes, it just comes down to resources. Practically unlimited funds, like limitless thrust, can get even a mad idea off the ground.

And so it might be for the concept of putting AI data centers in orbit. In a rare moment of unalloyed agreement, some of the richest and most powerful men in technology are staunchly backing the idea. The group includes Elon Musk, Jeff Bezos, Jensen Huang, Sam Altman, and Google CEO Sundar Pichai. In all likelihood, hundreds of people are now working on the concept of space data centers at the firms directly or indirectly controlled by these men—SpaceX, Starlink, Tesla, Amazon, Blue Origin, Nvidia, OpenAI, and Google, among others.

Pie charts compare the costs of an orbital data center ($51.1 billion) vs. a terrestrial data center ($16 billion). Likely costs to design, build, and launch a 1-GW orbital data center, based on a network of some 4,400 satellites and including operating costs over a five-year period, would exceed US $50 billion. That’s about three times the cost of a 1-GW data center on Earth, including five years of operation. John MacNeill

So how much would it cost to start training large language models in space? Probably the best accounting is one created by aerospace engineer Andrew McCalip. McCalip’s exhaustive, detailed analysis includes interactive sliders that let you compare costs for space-based and terrestrial data centers in the range of 1 to 100 gigawatts. One-gigawatt data centers are being built now on terra firma, and Meta has announced plans for a 5-GW facility, with anticipated completion some time after 2030.

In an interview, McCalip says his initial rough calculations a few years ago suggested that data centers in space would cost in the range of 7 to 10 times more, per gigawatt of capacity, than their terrestrial counterparts. “It just wasn’t practical,” he says. “Not even close.” But when Elon Musk began publicly backing the idea, McCalip revisited the numbers using publicly available information about Starlink’s and Tesla’s technologies and capabilities.

That changed the picture substantially. The figures in his online analysis assume an orbital network of data-center satellites that borrows heavily from Musk’s tech treasure chest—“essentially…you just start putting some radiation-resistant ASIC chips on the Starlink fleet and you start growing edge capacity organically on the Starlink fleet,” McCalip says. The network would rely on the kind of watt-efficient GPU architecture used in Teslas for self-driving, he adds. “You start dropping those onto the backs of Starlinks. You can slowly grow this out, and this would be approximately the performance that you would get.”

Bottom line, with some solid but not necessarily heroic engineering, the cost of an orbital data center could be as low as three times that of the comparable terrestrial one. That differential, while still high, at least nudges the concept out of the instantly dismissible category. “I have my particular views, but I want the data to speak for itself,” McCalip says.

For this illustration, we picked a configuration with an aggregate 1 GW of capacity. The network would consist of some 4,300 satellites, each of which would be outfitted with a 1-square-kilometer solar array that generates 250 kilowatts. The data center on that satellite, powered by the array, might have at least 175 GPUs; McCalip notes that a popular GPU rack, Nvidia’s NVL72, has 72 GPUs and requires 120 to 140 kW.

The total cost of the satellite network would be around US $51 billion, including launch and five years of operational expenses; a comparable terrestrial system would cost about $16 billion over the same period.
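
As a sanity check, the arithmetic behind that comparison is simple enough to script. The figures below come from the article; the satellite count is the caption's figure (the text also says roughly 4,300).

```python
# Back-of-envelope check of the article's cost figures (illustrative only).
orbital_cost = 51e9      # 1-GW orbital network: launch + 5 years of operation
terrestrial_cost = 16e9  # comparable 1-GW terrestrial center over 5 years
n_satellites = 4_400     # satellites in the proposed network (caption figure)

ratio = orbital_cost / terrestrial_cost       # ~3.2x, matching "about three times"
per_satellite = orbital_cost / n_satellites   # implied all-in cost per satellite

print(f"{ratio:.1f}x more expensive; ${per_satellite / 1e6:.1f}M per satellite")
```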

Stupid? Not stupid? You decide.

Reference: https://ift.tt/ZjvlXdM

Achieving Micron-Level Tolerances: CAD Optimization for Sub-10µm 3D Printing




Achieve successful micro-scale 3D prints by optimizing tolerances, wall thickness, support strategies, microfluidic channels, and material selection in your CAD models from the start.

What Attendees Will Learn

  1. Tolerance-driven design -- How to define resolution and tolerance constraints that translate directly from CAD intent to sub-10µm printed geometry.
  2. Geometry-aware fabrication -- Principles for engineering wall thickness, aspect ratios, and orientation to maintain structural fidelity at micron scale.
  3. Support-free design strategies -- Leveraging self-supporting geometries and build orientation to preserve feature integrity without post-processing trade-offs.
  4. Integrated material-process thinking -- Matching resin properties, shrinkage behavior, and export parameters to your application’s functional requirements.
Reference: https://ift.tt/XzeSbFI

Wednesday, February 25, 2026

How to Thrive as a Remote Worker




This article is crossposted from IEEE Spectrum’s careers newsletter. Sign up now to get insider tips, expert advice, and practical strategies, written in partnership with tech career development company Parsity and delivered to your inbox for free!

Standing Out as a Remote Worker Takes a Different Strategy

My first experience as a remote worker was a disaster.

Before I joined a San Francisco-based team with a lead developer in Connecticut, I had worked in person, five days a week. I thought success was simple: write good code, solve hard problems, deliver results. So I put my head down and worked harder than ever.

Twelve-hour days became normal as the boundary between work and personal life disappeared. My kitchen table became my office.

I rarely asked for help because I didn’t want to seem incompetent. I stayed quiet in team Slack channels because I wasn’t sure what to say.

Despite working some of the longest hours of my career, I made the slowest progress. I felt disconnected from the team. I had no idea if my work mattered or if anyone noticed what I was doing. I was burning out.

Eventually, I realized the real problem: I was invisible.

The Office Advantage You Lose When Remote

In an office, visibility happens naturally. Colleagues see you arrive early or stay late. They notice when you are stuck on a problem. They hear about your work in hallway conversations and over lunch. Physical presence creates recognition with almost no effort.

Remote work removes those signals. Your manager cannot see you at your desk. Your teammates don’t know you’ve hit a roadblock unless you say so. You can work long days and still appear less engaged than someone in the office.

That is the shift many people miss: Remote work requires execution plus deliberate communication.

What Actually Works

By my second remote role, I knew I had to change to protect my sanity and still succeed.

Here are five things I did that made a real difference.

1. Over-communicating

I began sharing updates in team channels regularly, not just when asked. “Working on the payment integration today; ready for review tomorrow.” “Hit a blocker with API rate limits; investigating options.” These took seconds but made my work visible and invited help sooner.

2. Setting limits

When your home is also your office, overwork becomes the default. I started ending most days at 5 p.m. and transitioning out of work mode with a walk or gym session. That ritual helped prevent burnout.

3. Volunteering for presentations

Presenting remotely felt less intimidating than standing in front of a room. I started volunteering for demos and lunch-and-learns. This increased my visibility beyond my immediate team and improved my communication skills.

4. Promoting others publicly

When someone helped me, I thanked them in a public channel. When a teammate shipped something impressive, I called it out. This builds goodwill and signals collaboration. In remote environments, gratitude is visible and memorable.

5. Building relationships deliberately

In an office, relationships form naturally. Remotely, you have to create those moments. I started an engineering book club that met every other week to discuss a technical book. It became a low-pressure way to connect with people across the organization.

The Counterintuitive Reality

With these habits, I got promoted faster in this remote job than I ever did in an office. I moved from senior engineer to engineering manager in under two years, while maintaining a better work-life balance.

Remote work offers flexibility and freedom, but it comes with a tax. You are easier to overlook and more likely to burn out unless you are intentional in your actions.

So, succeeding remotely takes deliberate effort in communication, relationships, and boundaries. If you do that well, remote work can unlock more opportunities than you might expect.

—Brian

This Former Physicist Helps Keep the Internet Secure

Despite its critical role in maintaining a secure network, authentication software often goes unnoticed by users. Alan DeKok now runs one of the most widely used remote authentication servers in the world—but he didn’t initially set out to work in cybersecurity. DeKok studied nuclear physics before starting the side project that eventually turned into a three-decade-long career.

Read more here.

More Than 30,000 Tech Employees Laid Off in 2026

We’re just two months into 2026, and layoffs in the tech industry are already ramping up. According to data compiled by RationalFX, more than half of the 30,700 layoffs this year have come from Amazon, which announced that it would be cutting the roles of 16,000 employees in late January. Will the trend continue through 2026?

Read more here.

IEEE Online Mini-MBA Aims to Fill Leadership Skills Gaps in AI

Recent research suggests that a majority of organizations have a significant gap when it comes to AI skills among leadership. To help fill the gap, IEEE has partnered with the Rutgers Business School to offer an online “mini-MBA” program, combining business strategy and deep AI literacy. The program spans 12 weeks and 10 modules that teach students how to implement AI strategies in their own organizations.

Read more here.

Reference: https://ift.tt/sSoWY9k

Jimi Hendrix Was a Systems Engineer




3 February 1967 is a day that belongs in the annals of music history. It’s the day that Jimi Hendrix entered London’s Olympic Studios to record a song using a new component. The song was “Purple Haze,” and the component was the Octavia guitar pedal, created for Hendrix by sound engineer Roger Mayer. The pedal was a key element of a complex chain of analog elements responsible for the final sound, including the acoustics of the studio room itself. When the tapes were sent for remastering in the United States, the sounds on them were so novel that they were accompanied by a note explaining that the distortion at the end was not a malfunction but intentional. A few months later, Hendrix would deliver his legendary electric guitar performance at the Monterey International Pop Festival.

“Purple Haze” firmly established that an electric guitar can be used not just as a stringed instrument with built-in pickups for convenient sound amplification, but also as a full-blown wave synthesizer whose output can be manipulated at will. Modern guitarists can reproduce Hendrix’s chain using separate plug-ins in digital audio workstation software, but the magic often disappears when everything is buffered and quantized. I wanted to find out if a more systematic approach could do a better job and provide insights into how Hendrix created his groundbreaking sound.

My fascination with Hendrix’s Olympic Studios’ performance arose because there is a “Hendrix was an alien” narrative surrounding his musical innovation—that his music appeared more or less out of nowhere. I wanted to replace that narrative with an engineering-driven account that’s inspectable and reproducible—plots, models, and a signal chain from the guitar through the pedals that you can probe stage by stage.

Four plots showing magnitudes plotted against time and frequency. Each effects pedal in Hendrix’s chain contributed to enhancing the electric guitar beyond its intrinsic limits. A selection of plots from the full-circuit analysis shows how the Fuzz Face turns a sinusoid signal from a string into an almost square wave; how the Octavia pedal inverts half the input waveform to double its frequency; how the wah-wah pedal acts as a band-pass filter; and how the Uni-Vibe pedal introduces selective phase shifts to color the sound. James Provost/Rohan S. Puranik

Although I work mostly in the digital domain as an edge-computing architect in my day job, I knew that analog circuit simulations would be the key to going deeper.

My first step was to look at the challenges Hendrix was trying to address. Before the 1930s, guitars were too quiet for large ensembles. Electromagnetic pickups—coils of wire wrapped around magnets that detect the vibrations of metal strings—fixed the loudness problem. But they left a new one: the envelope, which specifies how the amplitude of a note varies as it’s played on an instrument, starting with a rising initial attack, followed by a falling decay, and then any sustain of the note after that. Electric guitars attack hard, decay fast, and don’t sustain like bowed strings or organs. Early manufacturers tried to modify the electric guitar’s characteristics by using hollow bodies fitted with magnetic pickups, but the instrument still barked more than it sang.

Hendrix’s mission was to reshape both the electric guitar’s envelope and its tone until it could feel like a human voice. He tackled the guitar’s constraints by augmenting it. His solution was essentially a modular analog signal chain driven not by knobs but by hands, feet, gain staging, and physical movement in a feedback field.

Hendrix’s setups are well documented: Set lists, studio logs, and interviews with Mayer and Eddie Kramer, then the lead engineer at Olympic Studios, fill in the details. The signal chain for “Purple Haze” consisted of a set of pedals—a Fuzz Face, the Octavia, and a wah-wah—plus a Marshall 100-watt amplifier stack, with the guitar and room acoustics closing a feedback loop that Hendrix tuned with his own body. Later, Hendrix would also incorporate a Uni-Vibe pedal for many of his tracks. All the pedals were commercial models except for the Octavia, which Mayer built to produce a distorted signal an octave higher than its input.

Hendrix didn’t speak in decibels and ohm values, but he collaborated with engineers who did.

I obtained the schematics for each of these elements and their accepted parameter ranges, and converted them into netlists that ngspice can process (ngspice is an open-source implementation of the SPICE circuit simulator). The Fuzz Face pedal came in two variants, using germanium or silicon transistors, so I created models for both. In my models, Hendrix’s guitar pickups had a resistance of 6 kiloohms and an inductance of 2.5 henrys with a realistic cable capacitance.
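
Those pickup parameters already determine the instrument's characteristic treble resonance. A minimal sketch of that calculation, with the cable capacitance a value I've assumed (the article says only "realistic"):

```python
import math

# Resonance of the pickup model: R = 6 kOhm and L = 2.5 H are the values
# given in the text; the ~470 pF cable capacitance is my assumption
# (typical for a few meters of guitar cable).
R = 6_000       # pickup DC resistance, ohms
L = 2.5         # pickup inductance, henrys
C = 470e-12     # assumed cable capacitance, farads

f0 = 1 / (2 * math.pi * math.sqrt(L * C))  # resonant peak of the series R-L
Q = math.sqrt(L / C) / R                   # into parallel C, driven by a string

print(f"resonant peak: {f0:.0f} Hz, Q = {Q:.1f}")
```

With these values the peak lands in the mid-kilohertz "presence" region, which is why cable length audibly changes a passive pickup's tone.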

I chained the circuit simulations together using a script, and I produced data-plot and sample sound outputs with Python scripts. All of the ngspice files and other scripts are available in my GitHub repository at github.com/nahorov/Hendrix-Systems-Lab, with instructions on how to reproduce my simulations.

What Does the Analysis of Hendrix’s Signal Chain Tell Us?

Plotting the signal at different points in the chain with different parameters reveals how Hendrix configured and manipulated the nonlinear complexities of the system as a whole to reach his expressive goals.

A few highlights: First, the Fuzz Face is a two-transistor feedback amplifier that turns a gentle sinusoid signal into an almost binary “fuzzy” output. The interesting behavior emerges when the guitar’s volume is reduced. Because the pedal’s input impedance is very low (about 20 kΩ), the pickups interact directly with the pedal circuit. Reducing amplitude restores a sinusoidal shape—producing the famous “cleanup effect” that was a hallmark of Hendrix’s sound, where the fuzz drops in and out as desired while he played.
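
The cleanup effect can be illustrated with a toy saturating-gain model. This is not the ngspice transistor simulation, just a tanh stand-in showing how drive level alone moves the output between near-square and near-sinusoidal:

```python
import math

# Toy model of the Fuzz Face "cleanup" effect: a saturating gain stage
# (tanh stands in for the real two-transistor circuit) squares off a
# loud input but passes a quiet one nearly unchanged -- which is what
# rolling back the guitar's volume knob did.
N = 1000
sine = [math.sin(2 * math.pi * i / N) for i in range(N)]  # one cycle

def crest_factor(signal):
    """Peak/RMS: ~1.41 for a sine, approaching 1.0 as the wave squares off."""
    peak = max(abs(s) for s in signal)
    rms = math.sqrt(sum(s * s for s in signal) / len(signal))
    return peak / rms

crest = {gain: crest_factor([math.tanh(gain * s) for s in sine])
         for gain in (10.0, 0.3)}  # full volume vs. rolled-back volume
print(crest)  # heavy drive -> near-square shape; light drive -> near-sine
```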

A photograph of three young men beside a recording studio mixing desk. The Jimi Hendrix Experience (left to right): Mitch Mitchell, Jimi Hendrix, Noel Redding. Fred W. McDarrah/Getty Images

Second, the Octavia pedal used a rectifier, which normally converts alternating to direct current. Mayer realized that a rectifier effectively flips each trough of a waveform into a peak, doubling the number of peaks per second. The result is an apparent doubling of frequency—a bloom of second-harmonic content that the ear hears as a bright octave above the fundamental.
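
That frequency doubling is easy to verify numerically: full-wave rectify a sine and the zero-crossing rate of its AC component doubles.

```python
import math

# The Octavia's trick in miniature: rectifying a sine flips every trough
# into a peak, doubling the fundamental. Estimate frequency from zero
# crossings of the mean-removed signal, before and after rectification.
fs, f0 = 10_000, 100  # samples over one second; a 100 Hz test tone
sine = [math.sin(2 * math.pi * f0 * i / fs) for i in range(fs)]
rect = [abs(s) for s in sine]  # full-wave rectified version

def ac_frequency(signal):
    """Frequency estimate: two zero crossings of the AC part per cycle."""
    mean = sum(signal) / len(signal)
    ac = [s - mean for s in signal]
    crossings = sum(1 for a, b in zip(ac, ac[1:]) if a * b < 0)
    return crossings / 2

print(ac_frequency(sine), "Hz ->", ac_frequency(rect), "Hz")  # ~100 -> ~200
```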

Third, the wah-wah pedal is a band-pass filter: Frequency plots show the center frequency sweeping from roughly 300 hertz to 2 kilohertz. Hendrix used it to make the guitar “talk” with vowel sounds, most iconically on “Voodoo Child (Slight Return).”
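
The swept band-pass behavior can be sketched with a generic unity-peak second-order response (a stand-in for illustration, not the wah's actual circuit):

```python
import math

# A generic second-order band-pass magnitude response, with its center
# frequency swept across the 300 Hz to 2 kHz range the plots show.
def bandpass_gain(f, f0, q=3.0):
    """|H(j*2*pi*f)| of a unity-peak resonant band-pass centered at f0."""
    w, w0 = 2 * math.pi * f, 2 * math.pi * f0
    return (w0 / q) * w / math.hypot(w0 * w0 - w * w, (w0 / q) * w)

probes = (100, 300, 800, 2000, 5000)  # Hz
peaks = {}
for f0 in (300, 800, 2000):  # heel-to-toe sweep of the pedal
    gains = {f: bandpass_gain(f, f0) for f in probes}
    peaks[f0] = max(gains, key=gains.get)
print(peaks)  # each pedal position passes the band around its own center
```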

Fourth, the Uni-Vibe cascades four phase-shift sections controlled by photoresistors. In circuit terms, it’s a low-frequency oscillator modulating a variable-phase network; in musical terms it’s motion and air.

Finally, the whole chain became a closed loop by driving the Marshall amplifier near saturation, which among other things extends the sustain. In a reflective room, the guitar strings couple acoustically to the speakers—move a few centimeters and you shift from one stable feedback mode to another. To an engineer, this is a gain-controlled acoustic feedback system. To Hendrix, it was part of the instrument. He learned to tune oscillation with distance and angle, shaping sirens, bombs, and harmonics by walking the edge of instability.

Hendrix didn’t speak in decibels and ohm values, but he collaborated with engineers who did—Mayer and Kramer—and iterated fast as a systems engineer. Reframing Hendrix as an engineer doesn’t diminish the art. It explains how one person, in under four years as a bandleader, could pull the electric guitar toward its full potential by systematically augmenting the instrument’s shortcomings for maximum expression.

This article appears in the March 2026 print issue as “Jimi Hendrix, Systems Engineer.”

Reference: https://ift.tt/76Im1JV

Tuesday, February 24, 2026

This Physics Professor Credits Collaboration for Her Success




For Cinzia DaVià, collaboration isn’t just a buzzword. It’s the approach she applies to all her professional endeavors.

From her contributions to the development of a silicon sensor used in CERN (European Organization for Nuclear Research) particle accelerator experiments to her current research on portable energy generation solutions, there’s a common thread.

Cinzia DaVià


Employers

University of Manchester, England;

Stony Brook University, in New York

Job titles

Professor of physics; research professor

Member grade

Senior member

Alma maters

University of Bologna, Italy; University of Glasgow

As a professor of physics at the University of Manchester, in England, and a research professor at Stony Brook University, in New York, she has built strong connections across academic disciplines. Her continued involvement at CERN connects her with a broad array of professionals.

DaVià, an IEEE senior member, says she leverages her expertise and her network of collaborators to solve problems and build solutions. Her efforts include advancing high-energy particle experiments, improving cancer treatments, and mitigating the effects of climate change.

Collaboration is the foundation for any project’s success, she says. She credits IEEE for making many of her professional connections possible.

Even though she is the driving force behind building her alliances, she prefers to shine the spotlight on others, she says. For her, focusing on teamwork is more important than identifying individual contributions.

“The people involved in any project are really the ones to be celebrated,” she says. “The focus should be on them, not me.”

A career influenced by Italian television

As a young child growing up in the Italian Dolomites, her passion for physics was sparked by a popular documentary series, “Astronomia,” an Italian version of Carl Sagan’s renowned “Cosmos” series. The show was DaVià’s introduction to the world of astrophysics. She enrolled at Italy’s Alma Mater Studiorum/University of Bologna, confident she would pursue a degree in astronomy and astrophysics.

A summer internship at CERN in Geneva changed her career trajectory. She helped construct experiments for the Large Electron-Positron collider there. The LEP remains the largest electron-positron accelerator ever built. An underground tunnel was dug at CERN to accommodate the LEP’s 27-kilometer circumference; it was Europe’s biggest civil engineering project at the time.

The LEP was designed to validate the standard model of physics, which until then was a theoretical framework that attempted to explain the universe’s building blocks. The experiments—which performed precision measurements of the W and Z bosons, the charged and neutral force carriers central to particle physics—confirmed the standard model.

The LEP also paved the way, figuratively and literally, for CERN’s Large Hadron Collider. After the LEP was decommissioned in 2000, it was dismantled to make way for the LHC in the same underground tunnel.

As DaVià’s summer internship work on LEP experiments progressed, her professional focus shifted. Her plans to work in astrophysics gradually transitioned to a focus on radiation instrumentation.

After graduating in 1989 with a physics degree, she returned to CERN for a one-year assignment. As she got more involved in research and development for the large collider experiments, her one year turned into 10.

She received a CERN fellowship to help her finish her Ph.D. in physics at the University of Glasgow—which she received in 1997. Her work focused on radiation detectors and their applications in medicine.

“Nothing was programmed,” she says of her career trajectory. “It was always an opportunity that came after another opportunity, and things evolved along the way.”

A fusion of research and results

During her decade at CERN from 1989 to 1999, she contributed to several groundbreaking discoveries. One involved the radiation hardness of silicon sensors at cryogenic temperatures, referred to in physics as the Lazarus effect.

In the world of collider experiments, the silicon sensors function as eyes that capture the first moments of particle creation. The sensors are part of a larger detector unit that takes millions of images per second, helping scientists better understand particle creation.

In large collider experiments, the silicon sensors suffer significant damage from the radiation generated. After repeated exposure, the sensors eventually become nonfunctional.

DaVià’s contributions helped develop the process of reviving the dead detectors by cooling them down to temperatures below −143 °C.

Her proudest professional accomplishment, she says, was a different discovery at CERN: Her research helped usher in a new era of large collider experiments.

For many years, researchers there used planar silicon sensors in collider experiments. But as the large colliders grew more sophisticated and capable, the traditional planar silicon design couldn’t withstand the extreme radiation present at the epicenter of collider collisions.

DaVià’s research contributed to the development, together with inventor Sherwood Parker, of 3D silicon sensors that could withstand extreme radiation.

The new sensors are radiation-resistant and exceptionally fast, she says.

Scientists began replacing planar sensors in the detectors deployed closest to the center of each collision. Planar detectors are still widely used in collider experiments but farther from direct impacts.

The development of the 3D silicon sensor was groundbreaking, but DaVià says she is proud of it for a different reason. The collaborative approach of the cross-functional R&D team she built is the most noteworthy outcome, she says.

Initially, people with conservative scientific views resisted the idea of creating a new sensor technology, she says. She was able to bring together a broad coalition of scientists, researchers, and industry leaders to work together, despite the initial skepticism and competing interests. The team included two companies that were direct competitors.

That type of industry collaboration was unheard of at the time, she says.

“I was able to convince them,” she says, “that working together would be the best and fastest way forward.”

Her approach succeeded. The two companies not only worked side by side but also exchanged proprietary information. They went so far as to agree that if something halted progress for one of them, it would ship everything to the other so production could continue.

DaVià coauthored a book about the project, Radiation Sensors With 3D Electrodes.

A focus on sustainable entrepreneurship

DaVià has long been concerned about the impact of extreme weather events, especially on underserved populations. Her interest transformed into action after she attended the American Institute of Architects International and AIA Japan Osaka World Expo last year.

During the symposium, held in June, panelists shared insights about natural disasters in their regions and identified steps that could help mitigate damage and protect lives.

The topics that particularly interested DaVià, she says, were excessive glacial melt in the Himalayas and the lack of tsunami warnings on remote Indonesian islands.

One of the ideas that surfaced during a brainstorming session was that of “smart shelters” that could be deployed in remote areas to assist in recovery efforts. The shelters would provide power and a means of communication during outages.

The concept was inspired by MOVE, an IEEE-USA initiative. The MOVE program provides communities affected by natural disasters with power and communications capabilities. The services are contained within MOVE vehicles and are powered by generators. A single MOVE vehicle can charge up to 100 phones, bolstering communication capabilities for relief agencies and disaster survivors.

DaVià’s knowledge of MOVE guided the evolution of the smart shelter concept. She recognized, however, that the challenge of powering portable shelters needed to be solved. She took the lead and formed a cross-disciplinary team of IEEE members and other professionals to make headway. One result is a planned two-day conference on sustainable entrepreneurship to be held at CERN in October.

“IEEE helps bring people together who might not otherwise connect.”

The goal of the conference, she says, is to “join the dots across different disciplines by involving as many IEEE societies and external experts as possible to work toward deployable solutions that help improve life for people around the world.”

The two-day event will include a competition focusing on solutions for sustainable energy generation and storage systems, she says, adding that entrepreneurs will share their ideas on the second day.

Her commitment to developing solutions to mitigate destruction caused by extreme weather led to her involvement with the IEEE Online Forum on Climate Change Technologies. She led the way in creating the Climate Change Initiative within the IEEE Nuclear and Plasma Sciences Society (NPSS).

She was the driving force behind securing funding for two of the society’s climate-related events. One was the 2024 Climate Workshop on Nuclear and Plasma Solutions for Energy and Society. The second event, building on the success of the first, was last year’s workshop: Nuclear and Plasma Opportunities for Energy and Society, held in conjunction with the Osaka World Expo.

New paths to guide others

DaVià reduced her involvement at CERN when she joined the faculty at the University of Manchester as a physics professor. In 2016 she joined Stony Brook University as a research professor in the physics and astronomy department. She divides her time between the two schools.

She still maintains an office at CERN, where she works with students involved with particle physics. She is also an advisory board member of its IdeaSquare, an innovation space where science, technology, and entrepreneurial minds gather to brainstorm and test ideas. The goal is to identify ways to apply innovations generated by high-energy physics experiments to solve global challenges.

DaVià is the radiation detectors and imaging editor of Frontiers in Physics and a cochair of the European Union’s ATTRACT initiative, which promotes radiation imaging research across the continent. She is an active member of the European Physical Society, and she is an IEEE liaison officer for the physics and industry working group of the International Union of Pure and Applied Physics.

She has coauthored more than 900 publications.

IEEE as the connector

DaVià’s involvement with IEEE dates back to her undergraduate years, when she was introduced to the organization at a conference sponsored by the IEEE NPSS.

As her career grew, so did her involvement with IEEE.

She remains active with the society as a distinguished lecturer. She is a member of the IEEE Society on Social Implications of Technology, the IEEE Power & Energy Society, and the IEEE Women in Engineering group. She received the 2022 WIE Outstanding Volunteer of the Year Award.

She stays involved in IEEE to help her understand the work being done within each society and identify opportunities for cross-collaboration, she says. She sees such synergies as a key benefit of membership.

“IEEE helps bring people together who might not otherwise connect,” she says. “We are stronger together with IEEE.”

Reference: https://ift.tt/CNfEik9

Your Watch Will One Day Track Blood Pressure




Your smartwatch can track a lot of things, but at least for now, it can’t keep an accurate eye on your blood pressure. Last week, researchers from the University of Texas at Austin showed a way your smartwatch someday could. They were able to discern blood pressure by reflecting radio signals off a person’s wrist, and they plan to integrate the electronics that did it into a smartwatch in a couple of years.

Besides the tried-and-true blood pressure cuff, researchers have found several new ways to monitor blood pressure, using pasted-on ultrasound transducers, electrocardiogram sensors, bioimpedance measurements, photoplethysmography, and combinations of these measurements.

“We found that existing methods all face limitations,” Yiming Han, a doctoral candidate in the lab of Yaoyao Jia, told engineers at the IEEE International Solid-State Circuits Conference (ISSCC) last week in San Francisco. For example, ultrasound sensing requires long-term contact with the skin. And as cool as electronic tattoos seem, they’re not as convenient or comfortable as a smartwatch. Photoplethysmography, which detects the oxygenation state of blood using light, doesn’t need direct contact, and indeed researchers in Tehran and California recently used it with a heavy dose of machine learning to monitor blood pressure. However, these sensors are thought to be sensitive to a person’s skin tone, and similar sensors were blamed for Black people in the United States receiving inadequate treatment during the COVID-19 pandemic.

The University of Texas team sought a non-contact solution that was immune to skin-tone bias and could be integrated into a small device.

Continuous Blood Pressure Monitoring

Blood pressure measurements consist of two readings—systole, the peak pressure when the heart contracts and forces blood into arteries, and diastole, the phase in between heart contractions when pressure drops. During systole, blood vessels expand and stiffen and blood velocity increases. The opposite occurs in diastole.

All these changes alter conductivity, dielectric properties, and other tissue properties, so they should show up in reflected near-field radio waves, Jia’s colleague Deji Akinwande reasoned. Near-field waves are radiation impacting a surface that is less than one wavelength from the radiation’s source.

The researchers were able to test this idea using a common laboratory instrument called a vector network analyzer. Among its abilities, the analyzer can sense RF reflection, and the team was able to quickly correlate the radio response to blood pressure measured using standard medical equipment.

What Akinwande and Jia’s team saw was this: During systole, reflected near-field waves were more strongly out of phase with the transmitted radiation, while in diastole the reflections were weaker and closer to being in phase with the transmission.

You obviously can’t lug around a US $50,000 analyzer just to keep track of your blood pressure, so the team created a wearable system to do the job. It consists of a patch antenna strapped to a person’s wrist. The antenna connects to a device called a circulator—a kind of traffic roundabout for radio signals that steers outgoing signals to the antenna and signals coming in from the antenna to a separate circuit. A custom-designed integrated circuit feeds a 2.4 gigahertz microwave signal into one of the circulator’s on-ramps and receives, amplifies, and digitizes the much weaker reflection coming in from another branch. The whole system consumes just 3.4 milliwatts.
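
What the receive circuit ultimately has to do, recover the amplitude and phase of the reflection relative to the carrier, can be sketched as digital I/Q demodulation. The numbers here are scaled down and purely illustrative; the real chip does this in a custom mixed-signal IC at 2.4 GHz.

```python
import math, cmath

# Sketch of recovering a reflection's amplitude and phase -- the
# quantities the Texas team correlates with systole and diastole.
fs, f0, n = 100_000, 1_000, 1000              # scaled-down rate, tone, samples
true_amp, true_phase = 0.3, math.radians(40)  # pretend reflected signal

rx = [true_amp * math.cos(2 * math.pi * f0 * i / fs + true_phase)
      for i in range(n)]

# Mix against a complex copy of the carrier and average: the 2*f0 image
# cancels over whole cycles, leaving (amp/2) * exp(j*phase).
iq = sum(r * cmath.exp(-2j * math.pi * f0 * i / fs)
         for i, r in enumerate(rx)) / n

amp_est = 2 * abs(iq)
phase_est = math.degrees(cmath.phase(iq))
print(f"amplitude = {amp_est:.3f}, phase = {phase_est:.1f} deg")
```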

“Our work is the only one to provide no skin contact and no skin-tone bias,” Han said.

The next version of the device will use multiple radio frequencies to increase accuracy, says Jia, “because different people’s tissue conditions are different” and some might respond better to one frequency or another. Like the 2.4 gigahertz used in the prototype, these other frequencies will be of the sort already in common use, such as 5 GHz (a Wi-Fi frequency) and 915 megahertz (a cellular frequency).

Following those experiments, Jia’s team will turn to building the device into a smartwatch form factor and testing it more broadly for possible commercialization.

Reference: https://ift.tt/WyZHrtP
