Tech Talk

Users aboard 5G Champion's demo bus watch an ice hockey game streaming at 5 Gbps from a nearby 5G base station.

First Intercontinental 5G Trial Begins at Winter Olympics


Olympics fans arriving at South Korea’s Gangneung Station on their way to the coastal ice arenas this week are getting a sneak peek at 5G Champion, a pioneering mobile-broadband project two years in the making. This joint EU-Korea venture—led by France’s CEA-Leti and South Korea’s Electronics and Telecommunications Research Institute—stands out as a quiet contender in what the Games’ official telecommunications sponsor, KT Corp., has dubbed the “first 5G Olympics.”

Much ado has been made of KT's own widely publicized demos, and in pizazz they did not disappoint. After deploying its 5G trial network at the opening ceremony on Feb. 9 to synchronize, in real time, 1,200 flickering LED candles forming a giant dove, KT continues to dazzle spectators with display tablets and virtual-reality glasses live-streaming its vision of a 5G future: immersive footage from ski courses and bobsleigh cockpits; 360-degree close-ups of speed skaters and ice dancers; VR trips to hockey games and snowboarding runs.

Like these trial services and others popping up around the globe, 5G Champion (of which KT is one of 21 industry partners) makes use of the plethora of spectrum available at millimeter-wave frequencies to boost data rates and lower latency. 5G Champion's prototype radios, for instance, operate in a 1-gigahertz-wide band around 28 gigahertz, 10 times the maximum spectrum available to today's 4G networks.
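As a back-of-the-envelope illustration of why that extra spectrum matters: for a fixed spectral efficiency, throughput scales roughly linearly with bandwidth. The 4G bandwidth figure and the spectral-efficiency value below are illustrative assumptions on our part, not numbers from the trial.

```python
# Rough sketch of the bandwidth arithmetic behind millimeter-wave 5G.
BW_5G_HZ = 1e9      # 1-gigahertz-wide band around 28 GHz (5G Champion)
BW_4G_HZ = 100e6    # assumed ~maximum aggregated 4G spectrum today

ratio = BW_5G_HZ / BW_4G_HZ
print(f"Spectrum ratio: {ratio:.0f}x")  # the 10x the article cites

# Throughput grows with bandwidth at a fixed spectral efficiency:
spectral_eff = 5.0  # bits/s/Hz, an assumed modest figure
print(f"Rough capacity: {BW_5G_HZ * spectral_eff / 1e9:.0f} Gbps")
```

With those assumed numbers, the rough capacity works out to 5 Gbps, in line with the demo bus's advertised stream rate.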


New Quantum Crypto Scheme Looks Ahead to "Quantum Internet"

Chinese researchers have put forward a new quantum cryptography standard that could, if confirmed, substantially increase the speed of encrypted messages. The proposed standard has been simulated on computers but not yet tested in the lab.

Quantum cryptography, a next-generation approach to secret messaging whose secrecy is guaranteed by the laws of quantum mechanics, has been in the news recently. Last fall a group from the Chinese Academy of Sciences transmitted quantum-cryptographically encoded communications via satellite to a ground station in Vienna, Austria.

The communications included quantum-encoded images and a 75-minute quantum-cryptographically secured videoconference, consisting of more than 2 gigabytes of data. IEEE Spectrum reported on the event at the time. And now, as of last month, the entire project has been detailed in the journal Physical Review Letters.

Room where Mitsubishi 16 beam test is being conducted

Mitsubishi Electric Develops Hybrid 16-beam Spatial-Multiplexing Technology for 5G Base Stations


With mobile traffic in the coming 5G era expected to be a thousand times greater than what we’re generating today, mobile wireless infrastructure companies will need to provide greater transmission capacity, lower latency, and vastly more connectivity. To help achieve these goals, researchers at Mitsubishi Electric are testing a hybrid super-high-frequency massive multiple-input multiple-output (MIMO) system using hundreds of antenna elements with multibeam multiplexing to achieve efficient spectrum usage.

On 14 February, the company announced the development of a 16-beam spatial-multiplexing technology operating at 28 gigahertz for small 5G mobile base stations. What's more, Mitsubishi claims to have demonstrated the first 5G system to transmit 25.5 gigabits per second to a single user device over 500 megahertz of bandwidth.

Details of the system will be announced at the IEICE Technical Committee on Radio Communication Systems conference on 28 February.

The prototype base station used in the test consists of eight low-power analog front-end processing units that together form 16 beams, plus a MIMO digital-processing algorithm that reduces interference between the beams.

The system attains the gain of 4096 antenna elements, yet its computational complexity is just that of 16 elements, explains Atsushi Okamura, general manager of the Communication Technology Department, a unit in Mitsubishi Electric's Information Technology R&D Center in Kamakura, just south of Tokyo.

While all-digital massive MIMO produces high transmission performance, Okamura notes that it requires a digital signal processor, a digital-to-analog converter, and analog circuitry for each antenna. This would result in extremely high implementation and computation costs, not to mention a prohibitive increase in size.

"So we have implemented a hybrid beamforming system using active phased-array antenna and digital MIMO signal processing," he explains. This dramatically reduces number of components, yet yields almost the same performance, he adds.

That’s because the antenna elements are grouped into sub-arrays, each employing analog variable phase shifters to control beam direction. For example, if the number of antenna elements is 4096 and each sub-array contains 256 elements, then the number of sub-arrays is 16.

Because each beam is formed by the analog phase shifters in its sub-array, the hybrid beamforming system requires digital signal processing for only 16 beams. The prototype base station Mitsubishi tested employed eight 2-beam massive-MIMO RF modules of 512 antenna elements each; each module included two active phased-array antenna units operating at 28 GHz. At 7 centimeters, a module is about one-third as thick as the previous prototype module.
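The bookkeeping behind the hybrid approach can be sketched from the figures in the article; the variable names below are our own.

```python
# Hybrid beamforming arithmetic: analog phase shifters steer each
# sub-array, so only one digital chain per sub-array is needed.
total_elements = 4096
elements_per_subarray = 256

subarrays = total_elements // elements_per_subarray  # 16 beams
digital_chains_hybrid = subarrays                    # one per sub-array
digital_chains_all_digital = total_elements          # one per element

print(f"Digitally processed beams: {subarrays}")
print(f"Chain reduction vs. all-digital: "
      f"{digital_chains_all_digital // digital_chains_hybrid}x")

# The prototype realizes this with eight 2-beam RF modules:
modules, beams_per_module, elements_per_module = 8, 2, 512
assert modules * beams_per_module == subarrays        # 16 beams total
assert modules * elements_per_module == total_elements  # 4096 elements
```

That 256-fold reduction in digital chains is what lets the system keep the computational complexity of a 16-element array while reaping the gain of 4096 elements.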

In testing, the researchers achieved parallel transmission of 16 streams to a single device in a line-of-sight experiment in an anechoic chamber. They recorded a spectral efficiency of 63.7 bits per second per hertz and a download speed of 25.5 Gbps, which Mitsubishi believes is an industry first as of 14 February.
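A quick sanity check on those figures, assuming the 63.7 b/s/Hz value is the aggregate across all 16 streams; the interpretation of the gap between the gross and net numbers as overhead is our assumption, not Mitsubishi's.

```python
# Sanity check on the reported Mitsubishi test figures.
rate_bps = 25.5e9      # reported download speed
bandwidth_hz = 500e6   # reported bandwidth
streams = 16           # spatially multiplexed streams

net_efficiency = rate_bps / bandwidth_hz  # delivered bits per Hz
per_stream = 63.7 / streams               # reported efficiency per stream

print(f"Net spectral efficiency: {net_efficiency:.1f} b/s/Hz")
print(f"Per-stream efficiency:   {per_stream:.2f} b/s/Hz")
```

The delivered rate works out to 51.0 b/s/Hz, below the 63.7 b/s/Hz gross figure, a difference plausibly absorbed by guard bands and protocol overhead; per stream, the gross figure comes to roughly 4 b/s/Hz.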

Close-up of an imec chip

Imec Boosts Bluetooth Battery Life

A Bluetooth transceiver design that dramatically boosts battery life could enable richer sensor networks and extend the lifetime of implanted medical devices. At the International Solid-State Circuits Conference in San Francisco this week, engineers from European research organization imec and Renesas Electronics Corporation (a semiconductor company in Tokyo) showed off the record-low-voltage communications chip.

Over the past eight years, engineers have brought down Bluetooth power consumption by a factor of ten, says Christian Bachmann, program manager for ultralow-power wireless systems at imec's Holst Centre in Eindhoven, Netherlands. The imec transceiver, which meets the Bluetooth 5 standard, runs at 0.8 volts, down from a full volt. That reduction is enough to extend battery life by 50 percent. “This achieves another factor-of-five power reduction and will enable new applications,” Bachmann says.

One way the imec-Renesas group trimmed power requirements was by switching out analog circuits for digital ones. Bachmann says the last few years have seen a lot of innovation in digital radio design, and the imec group took full advantage. Digital logic is not only more reliable and compact than its analog counterpart, it's also miserly in its use of power. One significant switch to digital in the Bluetooth transceiver was in a control circuit called a phase-locked loop; the digital version offers better control, says Bachmann. The team also made architectural changes, including ditching an entire block of analog-to-digital converters in the receiver. Typical systems require two sets to ensure signal quality; the imec-Renesas converter works with high enough fidelity that only one is needed.

Bachmann is excited about the potential for ultralow-power communications not only to extend battery life in conventional applications, but also to open up new ones. “For wireless sensor networks, communications are the power bottleneck,” says Bachmann. Power-hungry transceivers can rule out the use of low-voltage printed batteries and energy harvesters. More efficient transceivers could open up new possibilities for wearable electronics and distributed sensor networks.

Virtual kitchen from the AI2-THOR simulator

AI2-THOR Interactive Simulation Teaches AI About Real World

Training a robot butler to make the perfect omelette could require breaking a lot of eggs and throwing out many imperfect attempts in a real-life kitchen.

That’s why researchers have been rolling out virtual training grounds as a more efficient alternative to putting AI agents through costly and time-consuming experiments in the real world.

Images showing 32-GHz resonant-fin transistors in 14-nm FinFET technology

FinFETs Shimmy to 5G’s Frequencies


Engineers at Purdue University and GlobalFoundries have gotten today’s most advanced transistors to vibrate at frequencies that could make 5G phones and other gadgets smaller and more energy efficient. The feat could also improve CPU clocks, make wearable radars possible, and one day form the basis of a new kind of computing. They presented their results today at the IEEE International Solid-State Circuits Conference, in San Francisco.

Photograph of the personal GPS boot created at the University of Utah.

Accurate Navigation Without GPS

The global positioning system can locate you within 5 to 10 meters anywhere on Earth—as long as your receiver is in the line of sight of multiple satellites. Getting location information indoors is tricky. A team at the University of Utah has now put the solution underfoot: A suite of sensors and circuits mounted to a boot can determine position with an accuracy of about 5 meters, indoors or out, without GPS.

The navigation system, installed in a very hefty prototype boot, could help rescue workers navigate inside buildings, and show firefighters where their team members are. It might also be integrated with virtual or augmented reality games. The Utah researchers presented their GPS-free navigation system on Tuesday at the International Solid-State Circuits Conference in San Francisco.

Back of emulator showing wires

MilliLabs Ignores Industry Skepticism to Build Emulator for Millimeter Waves


Throughout its development, 5G has been plagued by a simple question: Is there a way for engineers to test millimeter-wave propagation without committing to expensive and complex methods? The founders of one startup, MilliLabs, say they’ve found a solution.

“Everyone was doing over-the-air testing and we’re saying, ‘Dude, that’s nuts,’” says Aditya Dhananjay, the co-founder and president of MilliLabs. It was widely assumed that emulating millimeter waves was, for all intents and purposes, impossible.

Prior to 5G development, standard industry procedure was to use channel emulators to quickly gather a large amount of general data on the technologies being developed, before conducting more refined, over-the-air tests.

Microscope image of the integrated HEMT-LED device

New Device Could Drive MicroLED Displays, Li-Fi

A new device could make upcoming microLED displays easier to engineer and could speed up visible-light communications systems such as Li-Fi.

As IEEE Fellow Kei May Lau sees it, the problem with conventional LEDs, which are current-controlled devices, is that turning them on and off rapidly to control brightness, or using them for Li-Fi, takes careful engineering and a bunch of circuitry.

“Most IC designers would rather work with voltage-controlled devices, but LEDs are current-controlled devices,” says Lau. The combination of an LED’s high current and low voltage requirements makes designing drivers for them troublesome.

So she and her students invented a device, the HEMT-LED, that makes driving LEDs much easier. The HEMT-LED, which is a bit like a light-emitting transistor, lets you switch light emission on and off and control brightness with voltage signals.

Photo of bunnie Huang in front of computers.

How to Design a New Chip on a Budget

We recently had an interesting exchange with bunnie Huang, hardware guru and creator of Chumby, NetTV, and the Novena laptop, among other things. He’s also the author of Hacking the Xbox, The Essential Guide to Electronics in Shenzhen, and not one but two feature articles in IEEE Spectrum.

We were interested in Huang’s views about whether a small, modestly funded team—say a college-dorm startup—could produce a custom chip, just the way such groups now create board-level products and software with ease. 

Software ventures in particular benefit from the vast amount of open-source code that is available for use in building commercial products. (One study found that the average commercial application contains 35 percent open-source code.) We wanted to get a sense of whether chip designers also enjoyed a rich ecosystem of open-source building blocks.

Or is chip design still so closed and so challenging that it’s really just for large, established companies?
