
Tech

How to watch Salisbury Poisonings: The Untold Story online from anywhere


Three-part documentary Salisbury Poisonings: The Untold Story uncovers how a small UK cathedral city fell victim to the first confirmed nerve agent attack on British soil since the Second World War. It’s free to watch in the UK (with a TV licence) – but what if you’re abroad? Good news: we have the solution below.



Intel says software, not more cache, is key to beating AMD in gaming



Speaking to the German media outlet PC Games Hardware about Intel’s plans to compete with AMD’s X3D line of gaming CPUs, Vice President Robert Hallock said that hardcore PC enthusiasts are “significantly underestimating” the importance of software to the PC experience, and that no amount of cores or cache can…


EU and Parliament fail to agree on AI Act changes after 12 hours of talks, pushing deal to next month


The collapse of Tuesday’s trilogue exposes deep divisions over whether high-risk AI systems embedded in consumer products should be exempt from the world’s strictest AI rules

After 12 hours of negotiations on Tuesday, EU member states and European Parliament lawmakers failed to reach a deal on proposed changes to the bloc’s landmark AI Act. Talks will resume in May, according to Reuters.

“It was not possible to reach an agreement with the European Parliament,” said a Cypriot official; Cyprus currently holds the rotating EU Council presidency.

The failed session was the final scheduled political trilogue on the AI Omnibus, a package of amendments to the AI Act that entered into force in August 2024, as well as proposed changes to the GDPR, the e-Privacy Directive, and the Data Act. The Omnibus is framed as a competitiveness measure, aimed at reducing regulatory burdens on businesses to help European companies keep pace with US and Asian rivals. Its critics, who include a large coalition of privacy and civil rights organisations, argue it is a rollback of hard-won protections dressed up as simplification.


The core unresolved question on Tuesday was whether high-risk AI systems embedded in products already regulated under EU product safety legislation — medical devices, toys, connected cars, industrial machinery — should be exempt from the AI Act’s additional requirements. The European Parliament, backed by industry groups, has been pushing for these systems to be covered by their existing sectoral rules only. The Council, representing member states, has shown limited enthusiasm for such a broad carve-out.

The Omnibus has come under sustained criticism from researchers and civil society organisations who argue that weakening the AI Act before its core provisions have even come into force risks dismantling one of Europe’s most distinctive regulatory assets. Michael McNamara, the Parliament’s lead negotiator on the AI Omnibus, acknowledged in an interview with Tech Policy Press that overlapping rules can be difficult to manage, but warned that shifting AI governance into sectoral laws could ultimately be “deregulatory rather than simplifying.”

Civil society groups have been more direct. Over 40 organisations signed a letter to the Parliament in mid-April arguing that the proposed changes weaken the AI Act’s fundamental rights protections, particularly for biometric identification systems, AI used in schools, and medical AI. The AI Act was widely seen as a global standard-setter when it entered into force.

The urgency behind the negotiations is structural. The AI Act’s core obligations for high-risk AI systems are currently set to apply from August 2, 2026, just three months away. The entire purpose of the AI Omnibus is to postpone that deadline to December 2, 2027, for stand-alone high-risk systems, and to August 2, 2028, for those embedded in regulated products.

For that postponement to take legal effect before the August deadline, a final political agreement, formal Parliament vote, Council endorsement, and publication in the Official Journal must all occur in a matter of weeks.


If talks continue to stall in May and no agreement is reached before June, the original August 2026 deadline will stand. That would mean that companies relying on the Omnibus’s extended timelines would suddenly face immediate compliance obligations for which many have not adequately prepared — a scenario Brussels has been working hard to avoid.

The Omnibus also contains one widely supported measure: a ban on AI systems generating non-consensual intimate images, including child sexual abuse material. This was added to the package following the controversy over Elon Musk’s Grok chatbot’s nudification capabilities in late 2025, and both the Parliament and Council had already aligned on it. That the talks collapsed despite this area of consensus underlines how intractable the sectoral exemption question remains.

The resumption of talks next month will determine whether the EU can still claim to be doing this in an orderly way, or whether the world’s most ambitious AI regulation stumbles at the moment its hardest rules are meant to bite.



Apple Vision Pro Used In World-First Cataract Surgery


Apple’s Vision Pro has been used in what’s described as the world’s first cataract surgery performed with the headset. MacRumors reports: [New York ophthalmologist] Dr. Eric Rosenberg of SightMD completed the initial procedure in October 2025 and has since performed hundreds of additional cases using ScopeXR, a surgical platform he co-developed for Apple’s mixed reality device. ScopeXR streams live feeds from 3D digital surgical microscopes directly into the Vision Pro, which lets the surgeon view the operative field in stereoscopic 3D while overlaying preoperative diagnostic data. The platform also supports real-time remote collaboration, allowing surgeons to virtually join procedures and see exactly what the operating surgeon sees. “We are now able to bring the world’s best surgeon into any operating room, at any hour, from anywhere on the planet,” said Dr. Rosenberg in a company press release. “From residents performing their first cases to surgeons facing unexpected complications, this technology democratizes access to expertise and that will save vision.”



New Report Finds Some Babies Spend Up To Eight Hours a Day on Screens


fjo3 shares a report from The Times: More than two-thirds of babies under two use screens, a report has found, and some are exposed for up to eight hours a day. Nearly a third of newborns were found to be watching screens for more than three hours a day, while almost 20 percent of infants of four to 11 months used screens for more than an hour a day. The report comes after the government issued guidance that children under two should not use screens at all, apart from communal activities such as video-calling relatives.

In a review of the current research, researchers found evidence linking screen time to poorer outcomes for children, including an increased risk of obesity, short-sightedness, sleep and behavioural difficulties, and later challenges with friendships. […] The research also revealed why children and parents use screens, with families reporting children doing so for educational purposes, entertainment, play and to communicate and bond with others. Parents, meanwhile, used screens to occupy or distract children, which helped caregivers to complete domestic duties, paid employment and other caring responsibilities. Nearly a quarter of parents — 23.6 percent — either had no childcare or were not aware of the government’s early years offer.



Three Apple TV series land six Gotham Television nominations


“Pluribus,” “Margo’s Got Money Troubles,” and “Mr. Scorsese” have all been nominated for Gotham Television Awards, which could contribute to Apple’s growing trophy cabinet.

Apple TV debuted in 2019 with a handful of exclusive, original shows and films. While the service took some time to gain momentum, it is now regularly celebrated across the awards industry.

The latest nominations come from the Gotham Television Awards, which will be held on June 1. Three Apple TV shows received six nominations total.

Pluribus has already brought home awards from the Golden Globes and Peabody Awards, and it now adds three Gotham Television Award nominations: Breakthrough Drama Series, Outstanding Lead Performance in a Drama Series for Rhea Seehorn, and Outstanding Supporting Performance in a Drama Series for Karolina Wydra.


Next up is the more recent series Margo’s Got Money Troubles, nominated for Outstanding Lead Performance in a Comedy Series for Elle Fanning and Outstanding Supporting Performance in a Comedy Series for Michelle Pfeiffer.

Documentary Mr. Scorsese rounds out the nominations with one for Breakthrough Nonfiction Series.

Apple shared that it has reached 800 total wins and 3,437 nominations. It previously won Gotham Television Awards for The Studio, Pachinko, and CODA.



JMGO N3 Ultimate 4K Laser Projector Debuts World’s First 3-in-1 Optical System with Up to 5,800 Lumens


Projectors are shrinking, shifting, and finally making some sense for real rooms. The surge in so-called “lifestyle” models isn’t about chasing trends; it’s about fixing long-standing headaches like placement, portability, and footprint without gutting performance. JMGO leaned into that shift in 2024 with its N1S lineup, pairing a compact chassis with a genuinely useful integrated stand that doubles as a carry handle. It’s a practical rethink of the category—one that trades ceiling mounts and guesswork for flexibility you can actually use.


The JMGO N3 Ultimate

For 2026, JMGO is pushing further into that territory with the N3 Ultimate, a 4K projector built around what it claims is the world’s first 3-in-1 integrated optical system and rated up to 5,800 ISO lumens. That’s real brightness and image stability in a form factor that doesn’t demand a dedicated theater or a permanent install. Like its predecessor, it’s designed to move easily between rooms or even outdoors while minimizing the usual headaches around placement, angle, and setup.

JMGO says the N3 Ultimate’s 3-in-1 optical system combines key projection elements into a single integrated architecture, reducing optical path complexity while improving efficiency, alignment, and overall image consistency.

  • Four-way lens shift (Vertical ±130%, Horizontal ±53%) 
  • Wide-range optical zoom (0.88–1.7:1) 
  • Precision AI-powered gimbal (Vertical 150°, Horizontal 360°) 

By integrating these elements into a single system, the JMGO N3 Ultimate adapts to different positions and surfaces with far less effort than traditional designs. What used to involve multiple manual adjustments (angle, focus, keystone) is streamlined into a one-click setup process that removes most of the friction from getting a clean, properly aligned image.
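Those zoom figures translate directly into placement arithmetic. A rough sketch, assuming the usual convention that throw ratio is projection distance divided by image width and a 16:9 image (the function names here are ours, purely illustrative):

```python
import math

def image_width_from_diagonal(diagonal_in: float, aspect=(16, 9)) -> float:
    """Width in inches of an image with the given diagonal and aspect ratio."""
    w, h = aspect
    return diagonal_in * w / math.hypot(w, h)

def distance_range_m(diagonal_in: float, tr_min=0.88, tr_max=1.7) -> tuple:
    """Min/max projection distance in metres across the 0.88-1.7:1 zoom range."""
    width_m = image_width_from_diagonal(diagonal_in) * 0.0254  # inches -> metres
    return (tr_min * width_m, tr_max * width_m)

near, far = distance_range_m(120)  # a 120-inch screen
print(f"120-inch image: place the projector {near:.2f}-{far:.2f} m from the wall")
```

For a 120-inch screen, the zoom range gives roughly 2.3 to 4.5 metres of placement freedom, which is the practical point of a wide optical zoom.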


Who Needs Keystone Correction?

In real-world use, almost every projector ends up off center. Unlike many models that rely on digital keystone correction to fix placement and screen geometry, which can reduce resolution and brightness, the JMGO N3 Ultimate leans into optical adjustment first, preserving more of the original image quality while still accommodating less-than-perfect placement.

Optical Image Optimization: This feature aligns images instantly while maintaining clarity, detail, and brightness, even from off-center positions.

AI Spatial Image Memory: The N3 Ultimate remembers your space, allowing users to instantly switch projection between different walls and screens without touching the projector. From the front wall to the ceiling or surrounding walls, it stores preferred screen sizes, optimized image settings, and the corresponding app for each scenario.

The Single Click: With a single click, the N3 Ultimate restores the optimal setup and launches the relevant app – making setup and viewing hassle-free. 


Great Color, High Brightness, and 4K UHD Resolution


To support true spatial freedom across different environments, the N3 Ultimate delivers up to 5,800 ISO lumens, powered by JMGO’s MALC (Microstructure Adaptive Laser Control) 5.0 triple laser system, ensuring bright, vivid visuals even during daytime viewing. Key features of JMGO’s MALC include: Triple Laser, Reduced Speckling, Improved Uniformity, and a Compact Design. 

With a color accuracy of ΔE ≈ 0.7 and 110% BT.2020 color gamut coverage, the N3 Ultimate is designed to support the reproduction of native colors, further enhanced by 4K Ultra HD resolution (achieved via pixel shifting), Dolby Vision, and Dolby Audio for an immersive viewing experience. 
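For context on the ΔE ≈ 0.7 claim: ΔE is a distance between two colors in CIELAB space, and differences below about 1 are generally considered imperceptible to the eye. A minimal sketch of the simplest variant (CIE76) follows; the Lab triples are made-up illustrations, and JMGO does not state which ΔE formula (CIE76, CIE94, or CIEDE2000) its spec uses:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between (L*, a*, b*) triples."""
    return math.dist(lab1, lab2)

target   = (53.2, 80.1, 67.2)   # hypothetical reference red
measured = (53.6, 79.8, 67.6)   # hypothetical measured panel output
print(f"dE76 = {delta_e_76(target, measured):.2f}")
```

A result well under 1 means the measured color is effectively indistinguishable from the reference, which is what a spec like ΔE ≈ 0.7 is asserting.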


Great for Movies, Sports, and Gaming

The N3 Ultimate incorporates the MT9679 SoC and 8445 driver (laser controller), and provides support for Variable Refresh Rate (VRR), 1 ms ultra-low latency, and up to 240Hz refresh rate, delivering smooth, responsive visuals across movies, sports, and gaming, further enhanced by upgraded MEMC motion performance.

Wireless and Wired Connectivity

The JMGO N3 Ultimate includes two-way Bluetooth 5.2 connectivity. You can use the projector as a Bluetooth speaker or connect it to external Bluetooth speakers/headphones. A Bluetooth remote with voice capabilities is also included.


Wi-Fi is provided for easy access to streaming content (including licensed Netflix), alongside wired sources via HDMI (with eARC) and USB 3.0.

Comparison

| Spec | N3 Ultimate (2026) | N1S Ultimate (2024) |
| --- | --- | --- |
| Product Type | 4K UHD Laser Projector | 4K UHD Laser Projector |
| MSRP | $2,999 | $2,199 |
| Light Source | Tri-Color Laser | Tri-Color Laser |
| Display Technology | DLP | DLP |
| DMD Size | 0.47" | 0.47" |
| Resolution | 4K UHD | 4K UHD |
| Processing Chip | MT9679 | MT9629 |
| RAM | 4GB | 2GB |
| ROM | 64GB | 32GB |
| Screen Size | 40"–300" (optimized for 120–150" screens, even in moderately lit environments) | 40"–300" (optimized for 60–150" screens, even in moderately lit environments) |
| Throw Ratio | Optical zoom, 0.88–1.7:1 | 1.2:1 |
| Lens Shift | Vertical ±130%, horizontal ±53% | Not supported |
| Gimbal | AI Gimbal | AI Gimbal |
| Storage | 4GB + 64GB | 32GB |
| Gaming | VRR (Variable Refresh Rate), Low Latency, Pro Game Mode | No VRR; Low Latency, Game Mode |
| HDR | Dolby Vision, HDR10 | HDR10 |
| Brightness | 5,800 ISO lumens | 3,300 ISO lumens |
| Color Gamut | 110% BT.2020 | 110% BT.2020 |
| Color Accuracy | Delta E ≈ 0.7 | Not indicated |
| Contrast Ratio (FOFO) | 20,000:1 | 1,600:1 |
| Shadow Detail Enhancement | Supported | Not indicated |
| AI Dynamic Black | Supported | Not indicated |
| Space-Freedom Solution | Optical Image Optimization, AI Spatial Memory | Not indicated |
| Real-Time AI Adjustment | Auto-Screen Fitting, Shift-Auto Screen Fitting, Auto Keystone Correction, Auto Focus, Auto Obstacle Avoidance, Auto Eye Protection, Wall Color Adaptation | Instant Auto Keystone, Instant Auto Focus, Smart Eye Protection |
| Noise Level | <26dB | <26dB |
| OS | Google TV with native Netflix | Google TV with native Netflix |
| Connectivity | 2x HDMI 2.1 (one supports eARC), USB 3.0 | 2x HDMI 2.1 (one supports eARC), 1x USB-A 2.0 |
| Audio Power Output | 2x 12.5W (25W) | 20W subwoofer with 45Hz deep bass |
| Audio Format Support | Dolby Audio, DTS:X | Dolby Digital Plus, DTS-HD Master Audio |
| Dimensions | 308.3 x 229.85 x 274.13 mm (12.14 x 9.1 x 10.8 in) | 243 x 210 x 238 mm (9.57 x 8.3 x 9.37 in) |
| Weight | 6.95 kg / 15.3 lbs | 4.5 kg / 10 lbs |

The Bottom Line 

The JMGO N3 Ultimate builds on the earlier N1S line with a similar lifestyle-friendly design but raises the bar with up to 5,800 ISO lumens and more refined AI-assisted setup tools. The goal is simple: reduce reliance on keystone correction and minimize the need for lens shift by getting the image right through smarter positioning and optical control.

On the audio side, support for Dolby Audio and DTS:X is included, but expectations should be realistic. For a proper home theater experience, you will still want to connect it to an external soundbar or AVR over HDMI eARC. It also supports wireless Bluetooth headphones and can double as a standalone Bluetooth speaker when you are not watching anything, which adds some flexibility beyond movie night.

At $3000, it is not inexpensive for a compact projector, but the combination of high brightness, simplified setup, and portability makes it a viable option for apartment dwellers, flexible living spaces, and even outdoor viewing. Just do not assume it has the category to itself. Brands like XGIMI, Hisense, and Dangbei are all circling the same space with aggressive alternatives that deserve a look before you commit.


Pricing & Availability 

Shoppers can get 20% off regular pricing with early-bird savings until May 13, 2026.



The FPGA Chip Is an IEEE Milestone


Many of the world’s most advanced electronic systems—including Internet routers, wireless base stations, medical imaging scanners, and some artificial intelligence tools—depend on field-programmable gate arrays: computer chips whose internal hardware circuits can be reconfigured after manufacturing.

On 12 March, an IEEE Milestone plaque recognizing the first FPGA was dedicated at the Advanced Micro Devices campus in San Jose, Calif., the former Xilinx headquarters and the birthplace of the technology.

The FPGA earned the Milestone designation because it introduced iteration to semiconductor design. Engineers could redesign hardware repeatedly without fabricating a new chip, dramatically reducing development risk and enabling faster innovation at a time when semiconductor costs were rising rapidly.

The ceremony, which was organized by the IEEE Santa Clara Valley Section, brought together professionals from across the semiconductor industry and IEEE leadership. Speakers at the event included Stephen Trimberger, an IEEE and ACM Fellow whose technical contributions helped shape modern FPGA architecture. Trimberger reflected on how the invention enabled software-programmable hardware.


Solving computing’s flexibility-performance tradeoff

FPGAs emerged in the 1980s to address a core limitation in computing. A microprocessor executes software instructions sequentially, making it flexible but sometimes too slow for workloads requiring many operations at once.

At the other extreme, application-specific integrated circuits are chips designed to do only one task. ASICs achieve high efficiency but require lengthy development cycles and nonrecurring engineering costs, which are large, upfront investments. Expenses include designing the chip and preparing it for manufacturing—a process that involves creating detailed layouts, building masks for the fabrication machines, and setting up production lines to handle the tiny circuits.

“ASICs can deliver the best performance, but the development cycle is long and the nonrecurring engineering cost can be very high,” says Jason Cong, an IEEE Fellow and professor of computer science at the University of California, Los Angeles. “FPGAs provide a sweet spot between processors and custom silicon.”

Cong’s foundational work in FPGA design automation and high-level synthesis transformed how reconfigurable systems are programmed. He developed synthesis tools that translate C/C++ into hardware designs, for example.


At the heart of his work is an underlying principle first espoused by electrical engineer Ross Freeman: By configuring hardware using programmable memory embedded inside the chip, FPGAs combine hardware-level speed with the adaptability traditionally associated with software.

The FPGA architecture originated in the mid-1980s at Xilinx, a Silicon Valley company founded in 1984. The invention is widely credited to Freeman, a Xilinx cofounder and the startup’s CTO. He envisioned a chip with circuitry that could be configured after fabrication rather than fixed permanently during creation.

Articles about the history of the FPGA emphasize that he saw it as a deliberate break from conventional chip design.

At the time, semiconductor engineers treated transistors as scarce resources. Custom chips were carefully optimized so that nearly every transistor served a specific purpose.


Freeman proposed a different approach. He figured Moore’s Law would soon change chip economics. The principle holds that transistor counts roughly double every two years, making computing cheaper and more powerful. Freeman posited that as transistors became abundant, flexibility would matter more than perfect efficiency.

He envisioned a device composed of programmable logic blocks connected through configurable routing—a chip filled with what he described as “open gates,” ready to be defined by users after manufacturing. Instead of fixing hardware in silicon permanently, engineers could configure and reconfigure circuits as requirements evolved.

Freeman sometimes compared the concept to a blank cassette tape: Manufacturers would supply the medium, while engineers determined its function. The analogy captured a profound shift in who controls the technology, shifting hardware design flexibility from chip fabrication facilities to the system designers themselves.

In 1985 Xilinx introduced the first FPGA for commercial sale: the XC2064. The device contained 64 configurable logic blocks—small digital circuits capable of performing logical operations—arranged in an 8-by-8 grid. Programmable routing channels allowed engineers to define how signals moved between blocks, effectively wiring a custom circuit with software.


Fabricated using a 2-micrometer process (meaning that 2 µm was the minimum size of the features that could be patterned onto silicon using photolithography), the XC2064 implemented a few thousand logic gates. Modern FPGAs can contain hundreds of millions of gates, enabling vastly more complex designs. Yet the XC2064 established a design workflow still used today: Engineers describe the hardware behavior digitally and then “compile the design,” a process that automatically translates the plans into the instructions the FPGA needs to set its logic blocks and wiring, according to AMD. Engineers then load that configuration onto the chip.

The breakthrough: hardware defined by memory

Earlier programmable logic devices, such as erasable programmable read-only memory, or EPROM, allowed limited customization but relied on largely fixed wiring structures that did not scale well as circuits grew more complex, Cong says.

FPGAs introduced programmable interconnects—networks of electronic switches controlled by memory cells distributed across the chip. When powered on, the device loads a bitstream configuration file that determines how its internal circuits behave.

“As process technology improved and transistor counts increased, the cost of programmability became much less significant,” Cong says.


From “glue logic” to essential infrastructure

“Initially, FPGAs were used as what engineers called glue logic,” Cong says.

Glue logic refers to simple circuits that connect processors, memory, and peripheral devices so the system works reliably, according to PC Magazine. In other words, it “glues” different components together, especially when interfaces change frequently.

Early adopters recognized the advantage of hardware that could adapt as standards evolved. In “The History, Status, and Future of FPGAs,” published in Communications of the ACM, engineers at Xilinx and organizations such as Bell Labs, Fairchild Semiconductor, IBM, and Sun Microsystems said the earliest uses of FPGAs were for prototyping ASICs. They also used them to validate complex systems by running software before fabrication, allowing the companies to deploy specialized products manufactured in modest volumes.

Those uses revealed a broader shift: Hardware no longer needed to remain fixed once deployed.


Attendees at the Milestone plaque dedication ceremony included (seated, left to right) 2025 IEEE President Kathleen Kramer, 2024 IEEE President Tom Coughlin, and Santa Clara Valley Section Milestones Chair Brian Berg. Photo: Douglas Peck/AMD

Semiconductor economics changed the equation

The rise of FPGAs closely followed changes in semiconductor economics, Cong says.

Developing a custom chip requires a large upfront investment before production begins. As fabrication costs increased, products had to ship in large quantities to make ASIC development economically viable, according to a post published by AnySilicon.

FPGAs allowed designers to move forward without that larger monetary commitment.

ASIC development typically requires 18 to 24 months from conception to silicon, while FPGA implementations often can be completed within three to six months using modern design tools, Cong says. The shorter cycle and the ability to reconfigure the hardware enabled startups, universities, and equipment manufacturers to experiment with advanced architectures that were previously accessible mainly to large chip companies.


Lookup tables and the rise of reconfigurable computing

A popular technique for implementing mathematical functions in hardware is the lookup table (LUT). A LUT is a small memory element that stores the results of logical operations, according to “LUT-LLM: Efficient Large Language Model Inference with Memory-based Computations on FPGAs,” a paper selected for presentation next month at the 34th IEEE International Symposium on Field-Programmable Custom Computing Machines (FCCM).

Instead of repeatedly recalculating outcomes, the chip retrieves answers directly from memory. Cong compares the approach to consulting multiplication tables rather than recomputing the arithmetic each time.
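The idea can be sketched in a few lines: any Boolean function of k inputs becomes a 2^k-entry table, and “programming” a logic block just means filling that table, after which evaluation is a memory read rather than a recomputation. This is an illustrative model of the concept, not any vendor’s actual configuration format:

```python
def make_lut(func, k=4):
    """Precompute a k-input Boolean function into a 2^k-entry truth table."""
    table = []
    for i in range(2 ** k):
        bits = [(i >> b) & 1 for b in range(k)]  # unpack index into input bits
        table.append(func(*bits))
    return table

def lut_eval(table, *bits):
    """Evaluate by address lookup instead of recomputing the function."""
    addr = sum(bit << pos for pos, bit in enumerate(bits))
    return table[addr]

# "Program" a block as a 4-input majority gate, then query it by lookup.
majority = make_lut(lambda a, b, c, d: int(a + b + c + d >= 3))
print(lut_eval(majority, 1, 1, 1, 0))  # 1
print(lut_eval(majority, 1, 0, 0, 1))  # 0
```

The 4-input case is not arbitrary: the original XC2064’s configurable logic blocks implemented 4-input logic functions, i.e., 16-entry tables like the one above.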

Research led by Cong and others helped develop efficient methods for mapping digital circuits onto LUT-based architectures, shaping routing and layout strategies used in modern devices.

As transistor budgets expanded, FPGA vendors integrated memory blocks, digital signal-processing units, high-speed communication interfaces, cryptographic engines, and embedded processors, transforming the devices into versatile computing platforms.


Why the gate arrays are distinct from CPUs, GPUs, and ASICs

FPGAs coexist with other processors because each one optimizes different priorities. Central processing units excel at general computing. Graphics processing units, designed to perform many calculations simultaneously, dominate large parallel workloads such as AI training. ASICs provide maximum efficiency when designs remain stable and production volumes are high.


“FPGAs are not replacements for CPUs or GPUs,” Cong says. “They complement those processors in heterogeneous computing systems.”

Modern computing platforms increasingly combine multiple types of processors to balance flexibility, performance, and energy efficiency.


A Milestone for an idea, not just a device

This IEEE Milestone recognizes more than a successful semiconductor product. It also acknowledges a shift in how engineers innovate.

Reconfigurable hardware allows designers to test ideas quickly, refine architectures, and deploy systems while standards and markets evolve.

“Without FPGAs,” Cong says, “the pace of hardware innovation would likely be much slower.”

Four decades after the first FPGA appeared, the technology’s enduring legacy reflects Freeman’s insight: Hardware did not need to remain fixed. By accepting a small amount of unused silicon in exchange for adaptability, engineers transformed chips from static products into platforms for continuous experimentation—turning silicon itself into a medium engineers could rewrite.


Among those who attended the Milestone ceremony were 2025 IEEE President Kathleen Kramer; 2024 IEEE President Tom Coughlin; Avery Lu, chair of the IEEE Santa Clara Valley Section; and Brian Berg, history and milestones chair of IEEE Region 6. They joined AMD’s chief executive, Lisa Su, and Salil Raje, senior vice president and general manager of adaptive and embedded computing at AMD.

The IEEE Milestone plaque honoring the field-programmable gate array reads:

“The FPGA is an integrated circuit with user-programmable Boolean logic functions and interconnects. FPGA inventor Ross Freeman cofounded Xilinx to productize his 1984 invention, and in 1985 the XC2064 was introduced with 64 programmable 4-input logic functions. Xilinx’s FPGAs helped accelerate a dramatic industry shift wherein ‘fabless’ companies could use software tools to design hardware while engaging ‘foundry’ companies to handle the capital-intensive task of manufacturing the software-defined hardware.”

Administered by the IEEE History Center and supported by donors, the IEEE Milestone program recognizes outstanding technical developments worldwide that are at least 25 years old.


Check out Spectrum’s History of Technology channel to read more stories about key engineering achievements.




Electrical Current Might Be the Key To a Better Cup of Coffee


An anonymous reader quotes a report from Ars Technica: University of Oregon chemist Christopher Hendon loves his coffee — so much so that studying all the factors that go into creating the perfect cuppa constitutes a significant area of research for him. His latest project: discovering a novel means of measuring the flavor profile of coffee simply by sending an electrical current through a sample beverage. The results appear in a new paper published in the journal Nature Communications.

[…] The coffee industry typically uses a method for measuring the refractive index of coffee — i.e., how light bends as it travels through the liquid — to determine strength, but it doesn’t capture the contribution of roast color to the overall flavor profile. So for this latest study, Hendon decided to focus on roast color and beverage strength, the two variables most likely to affect the sensory profile of the final cuppa. His solution turned out to be quite simple. Hendon repurposed an electrochemical tool called a potentiostat, typically used to test battery and fuel cell performance. Hendon used the tool to measure how electricity interacted with the liquid. He found that this provided a better measurement of the flavor profile. He even tested it on four different samples of coffee beans and successfully identified the distinctive signature of a batch that had failed the roaster’s quality-control process.

Granted, one’s taste in coffee is fairly subjective, so Hendon’s goal was not to achieve a “perfect” cup but to give baristas a simple tool to consistently reproduce flavor profiles more tailored to a given customer’s taste. “It’s an objective way to make a statement about what people like in a cup of coffee,” said Hendon. “The reason you have an enjoyable cup of coffee is almost certainly that you have selected a coffee of a particular roast color and extracted it to a desired strength. Until now, we haven’t been able to separate those variables. Now we can diagnose what gives rise to that delicious cup.” Outside of his latest electrical-current experiment, Christopher Hendon’s coffee research has shown that espresso can be made more consistently by modeling extraction yield — how much coffee dissolves into the final drink — and controlling water flow and pressure.
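The extraction-yield bookkeeping mentioned above is standard coffee-industry arithmetic: the share of the dry dose that ends up dissolved in the cup, usually computed from a refractometer’s total dissolved solids (TDS) reading. A hedged sketch with typical espresso numbers, not values from Hendon’s paper:

```python
def extraction_yield(tds_percent: float, beverage_g: float, dose_g: float) -> float:
    """EY% = dissolved solids in the cup as a share of the dry coffee dose."""
    dissolved_g = tds_percent / 100 * beverage_g  # grams of solids in the drink
    return dissolved_g / dose_g * 100

# 18 g of grounds yielding a 36 g espresso measured at 10% TDS:
ey = extraction_yield(10.0, 36.0, 18.0)
print(f"extraction yield = {ey:.1f}%")  # 20.0% -- inside the common 18-22% target band
```

Holding this number steady across shots, by controlling water flow and pressure, is what "more consistent espresso" means in quantitative terms.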

He also found that static electricity from grinding causes fine coffee particles to clump, which disrupts brewing. The solution: adding a small squirt of water to beans before grinding (known as the Ross droplet technique) to reduce that static, cut clumping and waste, and lead to a stronger, more consistent espresso.


Tech

Albert King and Eddie Kirkland Vinyl and Digital Reissues Announced by Craft Recordings and Bluesville Records: Try Not to Miss Them Again


Blues doesn’t always get the same glossy reissue treatment or attention as jazz, but that says more about the market than the music. The truth is, blues is just as essential to the American story, and for a lot of listeners, it hits harder and feels more direct. You don’t have to decode it. You feel it. Audiences just proved that again with Sinners. They showed up for the rawness, the history, and the emotional truth that blues has always carried without apology.

And if we’re being honest, some of us connect to that more than jazz, which can feel repetitive and a little too polite when what you really want is some level of truth as you take another sip of your drink and second guess not calling her back or remember exactly why you loved or hated her in the first place.

Now Craft Recordings and its Bluesville Records imprint are doubling down on that legacy with two reissues that don't need a sales pitch, just a turntable. Albert King's I'll Play the Blues for You from 1972 and Eddie Kirkland's It's the Blues Man! from 1962 arrive June 12, 2026, with AAA remastering from the original analog tapes by Matthew Lutthans at The Mastering Lab. Both titles are pressed on 180-gram vinyl at Quality Record Pressings in partnership with Acoustic Sounds, housed in tip-on jackets, and include obi strips with new notes by Scott Billington.

These are not museum pieces or background music for suburban wine tastings with a fleet of Range Rover-driving Karens pretending they "get the blues." Their idea of the blues is the Starbucks app going down mid-order, the Ozempic shot disappearing between the seats, or someone cutting them off at H-E-B for the last bag of jerky meant for a hypoallergenic dog.


Albert King’s title track still cuts deep, and Eddie Kirkland’s “Saturday Night Stomp,” featuring King Curtis, has more life than most modern recordings that spend thousands trying to fake it. The reissues will also be available across digital platforms in hi-res and standard formats, with both key tracks already streaming.

Albert King’s I’ll Play the Blues for You Returns: Stax Era Fire, Memphis Muscle, No Apologies


Often lumped in with the other "Kings of the Blues" like Freddie King and B.B. King, Albert King didn't just belong in that conversation, he helped define it. Born in Mississippi in 1923, self-taught, and eventually landing in Memphis after stops in Gary and St. Louis, King built a sound that didn't ask for permission. His Gibson Flying V didn't whisper, it testified. The voice was just as unmistakable. Deep, worn, and completely uninterested in sounding pretty.

Signing with Stax Records changed everything. Backed by one of the tightest in-house crews in the business, King hit a run that most artists never get close to. "Laundromat Blues," "Crosscut Saw," and "Born Under a Bad Sign" were not just hits, they were statements. Blues that moved, grooves that hit, and songs that actually meant something.

By the time I’ll Play the Blues for You landed in 1972, King wasn’t chasing relevance. He already had it. Produced by Allen Jones, the album leans into a funkier, more modern feel without losing the grit. The Bar-Kays and The Movement handle the rhythm section with zero wasted motion, while The Memphis Horns bring the kind of punch that makes everything feel bigger without turning it into a circus.

The tracks stretch out because they need to. “I’ll Play the Blues for You” and “Breaking Up Somebody’s Home” both push past seven minutes, not for show, but because that’s how long it takes to say something real. The latter cracked the Hot 100 and landed in the R&B Top 40, but chart positions almost feel beside the point here. In 2017, the title track was inducted into the Blues Hall of Fame, which is nice, but anyone who’s actually heard it already knew.


Where to pre-order: $37 at Amazon (Available June 12, 2026)


Eddie Kirkland’s It’s the Blues Man! Returns: Raw, Road-Tested, and Cut at Van Gelder’s Peak


Eddie Kirkland didn’t come up through conservatories or polite circles. Born in Jamaica in 1923 and raised in Alabama, he learned everything the hard way and then took it on the road. A lot. As a teenager, he headed to Detroit and spent more than a decade playing alongside John Lee Hooker, which is about as real an education as it gets. In the early ’60s, he linked up with Otis Redding, serving as guitarist and bandleader. Not exactly a side gig you stumble into.


Somewhere between all that mileage, Kirkland managed to carve out his own lane. In 1962, he dropped It's the Blues Man! on Tru Sound, a short-lived offshoot of Prestige Records. It didn't come with hype or a marketing machine. It came with intent.

The session was engineered by Rudy Van Gelder, which tells you everything about the sound before the needle even drops. It’s immediate, punchy, and alive. Kirkland is backed by King Curtis and his band, and Curtis keeps things locked in with that unmistakable blend of R&B swing and street level grit. No excess, no filler.


Kirkland moves easily between styles because he actually lived them. “Train Done Gone” hits with purpose, “Man of Stone” locks you into its groove, and yeah, John Mayall covered it later on Crusade, which should tell you something. Then he pivots into slower cuts like “I’m Gonna Forget You” and “Have Mercy on Me” and reminds you that restraint can hit just as hard when it’s done right.

Where to pre-order: $37 at Amazon (Available June 12, 2026)


Tech

App Store policy must change as stay reversed


Apple will have to comply with previous mandates as it takes its fight with Epic Games back to the Supreme Court, so expect App Store changes soon.

The Apple vs Epic saga is years long and could easily fill a book at this point, but it hasn't ended yet. The latest update comes after Apple won a stay against enforcing App Store changes as it appealed to the Supreme Court.

That stay was short-lived, as Epic immediately appealed the stay and 9to5Mac shared that it has won. The US Ninth Circuit Court has reversed the stay it placed on enforcing a mandate that would require Apple to change how it charges developers for external purchases.

Basically, Apple won on every count in the Epic lawsuit except one. It was ordered to end its anti-steering rules and allow external purchases.


Apple complied, but its new setup for external commissions was constructed in a way that wouldn't make it worthwhile for developers to adopt. Apple was found in contempt of the order, and an injunction was issued to force Apple to allow external purchases with zero commission.

The injunction was appealed again and again, and eventually an agreement was reached that Apple should be allowed to charge a commission, just not 27%. A later ruling said that Apple and Epic must decide on what would be acceptable, but that hasn’t happened yet.

Apple was taking the case to the Supreme Court again and requested that the negotiations over a new fee and more App Store changes be stayed. It argued that there would be no need for lower court involvement until the Supreme Court appeal was done, and that stay was granted.

Epic appealed that stay order, and that’s where we are today. Even as Apple appeals to the Supreme Court, it will have to go back to the lower courts and work out the new commission structure.


Epic Games CEO Tim Sweeney took to social media to celebrate.

That’s quite the fall from wanting free and open access to the App Store user base. Even as Epic “wins,” Apple still gets to collect its dues.

Apple has the power to end this

Given that the Supreme Court already refused to hear the case once, it doesn't seem like things will go Apple's way. The company may not be able to charge as much as it wants, but at least the courts have agreed it is owed something.


All of these regulatory cases around the world can’t be avoided when you’re as big as Apple. However, I fully believe that Apple could reduce the pain if it wanted to.

It is well within Apple's power and resources to come up with a new App Store commission system that would still earn it plenty of money, that governments would approve of, and that only a few developers might sneer at. Epic will never be satisfied short of running Apple's App Store itself for all of the profit, but others would be happy with more revenue.

This ongoing epic began in 2020 with Epic purposefully violating App Store policy so it could goad Apple into a lawsuit. The entire campaign was pitched as Epic taking on big bad Apple and even came with a 1984-style advert.

Like with Spotify and other giants that take on Apple, it’s about maximizing their bottom line while leeching off of Apple’s user base. Users might benefit in the long run, but Epic has paid more than a billion dollars for what could be considered rather small victories.



Copyright © 2025