Advancements and Challenges in Networking Technologies: A Comprehensive Survey
25 of today’s coolest network and computing research projects
Latest concoctions from university labs include a language-learning website, a newfangled internet for mobile devices and even IP over xylophones.
University labs, fueled with millions of dollars in funding and some of the biggest brains around, are bursting with new research into computer and networking technologies.
Networking, computing and a general focus on shrinking things and making them faster are among the hottest areas, with some advances already making their way into the market. Here’s a roundup of 25 such projects that caught our eyes:
This free website, Duolingo, from a pair of Carnegie Mellon University computer scientists, serves double duty: It helps people learn new languages while also translating the text on Web pages into different languages.
CMU’s Luis von Ahn and Severin Hacker have attracted more than 100,000 people in a beta test of the system, which initially offered free language lessons in English, Spanish, French and German, with the computer offering advice and guidance on unknown words. Using the system could go a long way toward translating the Web, many of whose pages are unreadable by those whose language skills are narrow.
Von Ahn is a veteran of such crowdsourcing technologies, having created online reCAPTCHA puzzles to cut down on spam while simultaneously digitizing old books and periodicals. Von Ahn’s spinoff company, reCAPTCHA, was acquired by Google in 2009. Duolingo, spun off in November to offer commercial and free translation services, received $3.3 million in funding from Union Square Ventures, actor Ashton Kutcher and others.
Princeton University Computer Science researchers envision an Internet that is more flexible for operators and more useful to mobile users. Princeton’s Serval system is what Assistant Professor of Computer Science Michael Freedman calls a Service Access Layer that sits between the IP Network Layer (Layer 3) and Transport Layer (Layer 4), where it can work with unmodified network devices. Serval’s purpose is to make Web services such as Gmail and Facebook more easily accessible, regardless of where an end user is, via a services naming scheme that augments what the researchers call an IP address set-up “designed for communication between fixed hosts with topology-dependent addresses.” Data center operators could benefit by running Web servers in virtual machines across the cloud and rely less on traditional load balancers.
Serval, which Freedman describes as a “replacement” technology, will likely see its first production use in service-provider networks. “Its largest benefits come from more dynamic settings, so its features most clearly benefit the cloud and mobile spaces,” he says.
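Serval itself lives in the network stack, so there is no simple script that reproduces it; as a hedged illustration of the service-naming idea it builds on (clients resolve a stable service identifier to whichever instance addresses are currently registered, rather than binding to a fixed IP), here is a minimal Python sketch. The service IDs and addresses below are hypothetical, not Serval’s actual interface.

```python
# Minimal sketch of service-level resolution, loosely inspired by the idea
# behind Serval: clients ask for a stable service ID, and a resolver maps it
# to whatever host:port instances are currently registered. Names and
# addresses are hypothetical, not Serval's actual API.
import random

class ServiceResolver:
    def __init__(self):
        self._instances = {}  # service_id -> set of (ip, port)

    def register(self, service_id, addr):
        """A server instance (e.g. a VM that just migrated) announces itself."""
        self._instances.setdefault(service_id, set()).add(addr)

    def unregister(self, service_id, addr):
        self._instances.get(service_id, set()).discard(addr)

    def resolve(self, service_id):
        """Return any live instance; callers never hard-code an IP."""
        instances = self._instances.get(service_id)
        if not instances:
            raise LookupError(f"no instances for {service_id}")
        return random.choice(sorted(instances))

resolver = ServiceResolver()
resolver.register("webmail", ("10.0.0.5", 443))
resolver.register("webmail", ("10.0.1.9", 443))   # second replica elsewhere in the cloud
print(resolver.resolve("webmail"))                 # client binds to the service, not a host
```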
If any of this sounds similar to software-defined networking (SDN), there are in fact connections. Freedman worked on an SDN/OpenFlow project at Stanford University called Ethane that was spun out into a startup called Nicira for which VMware recently plunked down $1.26 billion.
Wi-Fi routers to the rescue
Researchers at Germany’s Technical University of Darmstadt have described a way for home Wi-Fi routers to form a backup mesh network to be used by the police, firefighters and other emergency personnel in the case of a disaster or other incident that wipes out standard cell and phone systems.
The proliferation of Wi-Fi routers makes the researchers confident that a dense enough ad hoc network could be created, but they noted that a lack of unsecured routers would require municipalities to work with citizens to allow for the devices to be easily switched into emergency mode. The big question is whether enough citizens would really allow such access, even if security was assured.
Hyperspeed signaling
University of Tulsa engineers want to slow everything down, for just a few milliseconds, to help network administrators avoid cyberattacks.
By slowing traffic, the researchers figure, more malware can be detected and then headed off via an algorithm that signals at hyperspeed to set up defenses. Researcher Sujeet Shenoi told the publication New Scientist, though, that such a defense system might not be cheap to set up, given the caching systems and reserved data pipes needed to support the signals.
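The article doesn’t detail the Tulsa design, but the core idea (hold payload traffic in a short delay buffer while a tiny warning message races ahead to prime defenses) can be illustrated with a toy Python sketch; the detection rule, delay value and messages below are all invented.

```python
# Toy illustration of the "slow the data, speed the warning" idea described
# above: payload traffic sits in a short delay buffer while an out-of-band
# alert (only a few bytes) is dispatched immediately, giving downstream
# defenses a head start. Thresholds and timings are invented.
import queue, threading, time

DELAY_SECONDS = 0.005          # hold each packet ~5 ms
buffer_q = queue.Queue()

def inspect(packet):
    """Stand-in for malware detection; flags a hypothetical bad signature."""
    return b"EVIL" in packet

def send_alert(packet):
    print("ALERT raced ahead of suspicious packet:", packet[:16])

def forward(packet):
    print("forwarded:", packet[:16])

def delayed_forwarder():
    while True:
        release_at, packet = buffer_q.get()
        time.sleep(max(0.0, release_at - time.monotonic()))
        forward(packet)

threading.Thread(target=delayed_forwarder, daemon=True).start()

for packet in [b"GET /index.html", b"EVIL payload....", b"POST /login"]:
    if inspect(packet):
        send_alert(packet)                                        # fast path: warn now
    buffer_q.put((time.monotonic() + DELAY_SECONDS, packet))      # slow path: delayed delivery

time.sleep(0.1)  # let the forwarder drain the buffer in this demo
```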
Control-Alt-Hack
University of Washington researchers have created a card game called Control-Alt-Hack that’s designed to introduce computer science students to security topics.
The game, funded in part by Intel Labs and the National Science Foundation, made its debut at the Black Hat security conference in Las Vegas over the summer. The tabletop game involves three to six players working for an outfit dubbed Hackers, Inc., that conducts security audits and consulting, and players are issued challenges, such as hacking a hotel mini bar payment system or wireless medical implant, or converting a robotic vacuum cleaner into a toy. The game features cards (including descriptions of well-rounded hackers who rock climb, ride motorcycles and do more than sit at their computers), dice, mission cards, “hacker cred tokens” and other pieces, and is designed for players ages 14 and up. It takes about an hour to play a game. No computer security degree needed.
“We went out of our way to incorporate humor,” said co-creator Tamara Denning, a UW doctoral student in computer science and engineering, referring to the hacker descriptions and challenges on the cards. “We wanted it to be based in reality, but more importantly we want it to be fun for the players.”
Ghost-USB-Honeypot project
This effort, focused on nixing malware like Flame that spreads from computer to computer via USB storage drives, got its start based on research from Sebastian Poeplau at Bonn University’s Institute of Computer Science. Now it’s being overseen by the broader Honeynet Project.
The breakthrough by Poeplau and colleagues was to create a virtual drive that runs inside a USB drive to snag malware. According to the project website: “Basically, the honeypot emulates a USB storage device. If your machine is infected by malware that uses such devices for propagation, the honeypot will trick it into infecting the emulated device.”
One catch: the security technology works only on 32-bit Windows XP, for starters.
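The real Ghost honeypot is a Windows kernel driver that emulates a USB mass-storage device; as a much simpler, hedged illustration of the detection logic (any write to a decoy device that no legitimate program should touch is treated as evidence of infection), here is a Python sketch that just watches a decoy folder. The decoy path and polling scheme are invented.

```python
# Greatly simplified illustration of the honeypot logic described above: the
# real Ghost project emulates a USB drive in a kernel driver; this sketch just
# watches a decoy folder (standing in for the emulated drive) and treats any
# unexpected write as a sign of USB-propagating malware. The decoy path is
# hypothetical.
import os, time

DECOY = "/tmp/ghost_decoy_drive"   # nothing legitimate should ever write here
os.makedirs(DECOY, exist_ok=True)

def snapshot():
    return {name: os.path.getmtime(os.path.join(DECOY, name))
            for name in os.listdir(DECOY)}

baseline = snapshot()
for _ in range(3):                 # poll a few times for the demo
    time.sleep(1.0)
    current = snapshot()
    new_or_changed = {n for n, m in current.items()
                      if n not in baseline or m != baseline[n]}
    if new_or_changed:
        print("Possible infection: unexpected writes to decoy:", new_or_changed)
    baseline = current
```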
IP over Xylophone Players (IPoXP)
Practical applications for running IP over xylophones might be a stretch, but doing so can teach you a few things about the truly ubiquitous protocol.
A University of California, Berkeley researcher named R. Stuart Geiger led this project, which he discussed earlier this year at the Association for Computing Machinery’s Conference on Human Factors in Computing Systems. Geiger’s Internet Protocol over Xylophone Players (IPoXP) provides a fully compliant IP connection between two computers. His setup uses a pair of Arduino microcontrollers, some sensors, a pair of xylophones and two people to play the xylophones.
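The article doesn’t spell out Geiger’s encoding, but the general idea (each byte of an IP packet is turned into a short sequence of notes a human can play, and decoded back to bytes on the far side) can be sketched in a few lines of Python. The 16-note alphabet and hex-nibble mapping below are invented for illustration, not Geiger’s actual scheme.

```python
# Hedged sketch of the general idea behind IPoXP: each byte of a packet is
# spelled out as xylophone notes that a human plays and a sensor on the far
# side decodes. The note alphabet and nibble encoding are invented.
NOTES = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5",
         "D5", "E5", "F5", "G5", "A5", "B5", "C6", "D6"]   # one note per hex nibble

def packet_to_notes(packet: bytes) -> list:
    notes = []
    for byte in packet:
        notes.append(NOTES[byte >> 4])    # high nibble
        notes.append(NOTES[byte & 0x0F])  # low nibble
    return notes

def notes_to_packet(notes: list) -> bytes:
    nibbles = [NOTES.index(n) for n in notes]
    return bytes((hi << 4) | lo for hi, lo in zip(nibbles[0::2], nibbles[1::2]))

ping = b"\x45\x00\x00\x1c"                 # first bytes of a hypothetical IPv4 header
played = packet_to_notes(ping)
print(played)                              # what the person at the xylophone would play
assert notes_to_packet(played) == ping     # the receiving side recovers the bytes
```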
The exercise provided some insights into the field of Human-Computer Interaction (HCI). It emulates a technique HCI specialists use to design interfaces called umwelt, which is a practice of imagining what the world must look like to the potential users of the interface. This experiment allowed participants to get the feel for what it would be like to be a circuit.
“I don’t think I realized how robust and modular the OSI model is,” Geiger said. “The Internet was designed for much more primitive technologies, but we haven’t been able to improve on it, because it is such a brilliant model.”
Making software projects work
San Francisco State University and other researchers are puzzling over why so many software projects wind up getting ditched, failing outright, or being completed late and over budget. The key, they’ve discovered, is rethinking how software engineers are trained and managed to ensure they can work as teams.
The researchers, also from Florida Atlantic University and Fulda University in Germany, are conducting a National Science Foundation-funded study with their students that they hope will result in a software model that can predict whether a team is likely to fail. Their study will entail collecting information on how often software engineering students – teamed with students at the same university and at others – meet, email each other, etc.
“We want to give advice to teachers and industry leaders on how to manage their teams,” says Dragutin Petkovic, professor and chair of SF State’s Computer Science Department. “Research overwhelmingly shows that it is ‘soft skills,’ how people work together, that are the most critical to success.”
Ultra low-power wireless
Forget about 3G, 4G and the rest: University of Arkansas engineering researchers are focused on developing very low-power wireless systems that can grab data from remote sensors regardless of distortion along the network path.
These distortion-tolerant systems would enable sensors, powered by batteries or energy-harvesting, to remain in the field for long periods of time and withstand rough conditions to monitor diverse things such as tunnel stability and animal health. By tolerating distortion, the devices would expend less energy on trying to clean up communications channels.
“If we accept the fact that distortion is inevitable in practical communication systems, why not directly design a system that is naturally tolerant to distortion?” says Jingxian Wu, assistant professor of electrical engineering.
The National Science Foundation is backing this research with $280,000 in funding.
2-way wireless
University of Waterloo engineering researchers have developed a way for wireless voice and data signals to be sent and received simultaneously on a single radio channel frequency, a breakthrough they say could make for better performing, more easily connected and more secure networks.
“This means wireless companies can increase the bandwidth of voice and data services by at least a factor of two by sending and receiving at the same time, and potentially by a much higher factor through better adaptive transmission and user management in existing networks,” said Amir Khandani, a Waterloo electrical and computer engineering professor, in a statement. He says the hardware and antennas needed to support such a system wouldn’t cost any more than those for current one-way systems.
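The article doesn’t describe Khandani’s design, but the standard obstacle to transmitting and receiving on one frequency at the same time is that a radio’s own (very strong) transmission drowns out the incoming signal, so full-duplex schemes in general estimate and subtract that self-interference. Here is a hedged numerical sketch of that generic principle, not the Waterloo implementation; all signal levels are made up.

```python
# Generic illustration of self-interference cancellation, the usual hurdle for
# single-frequency two-way radios: the receiver estimates how strongly it hears
# its own transmission and subtracts it, leaving the much weaker remote signal.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
remote = rng.choice([-1.0, 1.0], size=n)          # weak signal we want to recover
own_tx = rng.choice([-1.0, 1.0], size=n)          # our simultaneous transmission
self_gain = 40.0                                   # self-interference ~40x stronger
received = remote + self_gain * own_tx + 0.1 * rng.standard_normal(n)

# We know exactly what we transmitted; estimate the self-interference gain by
# least squares and subtract our own signal out.
est_gain = np.dot(received, own_tx) / np.dot(own_tx, own_tx)
cleaned = received - est_gain * own_tx

errors_before = int(np.sum(np.sign(received) != remote))
errors_after = int(np.sum(np.sign(cleaned) != remote))
print(f"bit errors before cancellation: {errors_before}, after: {errors_after}")
```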
Next up is getting industry involved so that two-way operation can be included in forthcoming standards, enabling widespread implementation.
The Waterloo research was funded in part by the Canada Foundation for Innovation and the Ontario Ministry of Research and Innovation.
Spray-on batteries
Researchers at Rice University in Houston have developed a prototype spray-on battery that could allow engineers to rethink the way portable electronics are designed.
The rechargeable battery boasts similar electrical characteristics to the lithium ion batteries that power almost every mobile gadget, but it can be applied in layers to almost any surface with a conventional airbrush, said Neelam Singh, a Rice University graduate student who led a team working on the technology for more than a year.
Current lithium ion batteries are almost all variations on the same basic form: an inflexible block with electrodes at one end. Because they cannot easily be shaped, they sometimes restrict designers, particularly when it comes to small gadgets with curved surfaces, but the Rice prototypes could change that. “Today, we only have a few form factors of batteries, but this battery can be fabricated to fill the space available,” said Singh.
The battery is sprayed on in five layers: two current collectors sandwich a cathode, a polymer separator and an anode. The result is a battery that can be sprayed on to plastics, metal and ceramics.
The researchers are hoping to attract interest from electronics companies, which Singh estimates could put it into production relatively easily. “Airbrushing technology is well-established. At an industrial level it could be done very fast,” she said.
Mobile Mosh pit
Two MIT researchers formally unveiled over the summer a protocol called State Synchronization Protocol (SSP) and a remote log-in program using it dubbed Mosh (for mobile shell) that’s intended as an alternative to Secure Shell (SSH) for ensuring good connectivity for mobile clients even when dealing with low bandwidth connections. SSP and Mosh have been made available for free, on GNU/Linux, FreeBSD and OS X, via an MIT website.
SSH, often used by network and system admins for remotely logging into servers, traditionally connects computers via TCP, but it’s that use of TCP that creates headaches for mobile users, since TCP assumes that the two endpoints are fixed, says Keith Winstein, a graduate student with MIT’s Computer Science and Artificial Intelligence Lab (CSAIL), and Mosh’s lead developer. “This is not a great way to do real-time communications,” Winstein says. SSP uses UDP, a connectionless, stateless transport mechanism that could be useful for stabilizing mobile usage of apps from Gmail to Skype.
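As a hedged, much-simplified illustration of why a UDP-based protocol copes with roaming clients, the sketch below has a server key each session on a token inside the datagram and reply to whatever source address the latest datagram arrived from, so the client can change IP addresses mid-session. Real SSP authenticates datagrams cryptographically; the plaintext token and port here are hypothetical.

```python
# Simplified illustration of datagram-based roaming: the server tracks a
# session by a token in each datagram and replies to the latest source
# address, rather than pinning the session to one TCP endpoint.
import socket

HOST, PORT = "127.0.0.1", 60001
sessions = {}   # session token -> last known client address

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind((HOST, PORT))
server.settimeout(2.0)

# Simulate one client that "roams" by sending from two different sockets.
client_a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client_b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client_a.sendto(b"sess42:hello", (HOST, PORT))
client_b.sendto(b"sess42:still me, new address", (HOST, PORT))

for _ in range(2):
    data, addr = server.recvfrom(1024)
    token, _, payload = data.partition(b":")
    moved = token in sessions and sessions[token] != addr
    sessions[token] = addr                    # latest address wins
    server.sendto(b"ack:" + payload, addr)    # reply to wherever the client is now
    print(f"{token.decode()} from {addr} (roamed: {moved}): {payload.decode()}")
```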
Network Coding
Researchers from MIT, the California Institute of Technology and the Technical University of Munich are putting network coding and error-correction coding to use in an effort to measure the capacity of wired and, more challengingly, even small wireless networks (read their paper for the gory details).
The researchers have figured out a way to gauge the upper and lower bounds of capacity in a wireless network. Such understanding could enable enterprises and service providers to design more efficient networks regardless of how much noise is on them (and wireless networks can get pretty darn noisy).
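The capacity-bound math itself is beyond a short example, but the textbook illustration of network coding (the classic two-flow relay case, not the specific constructions in this paper) is easy to show: a relay XORs two packets together so that a single broadcast serves both receivers, each of which recovers the packet it is missing.

```python
# Classic network-coding example: instead of forwarding packet A and packet B
# separately, a relay broadcasts A XOR B once; a node that already holds B
# recovers A, and a node holding A recovers B, so one transmission serves both.
def xor_packets(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

packet_a = b"hello from node A!"
packet_b = b"reply from node B!!"[:len(packet_a)]   # trim to equal length

coded = xor_packets(packet_a, packet_b)             # the relay's single broadcast

# Node B already knows its own packet_b, so it can recover packet_a:
assert xor_packets(coded, packet_b) == packet_a
# Node A already knows packet_a, so it can recover packet_b:
assert xor_packets(coded, packet_a) == packet_b
print("one coded transmission delivered both packets")
```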
More details are available from the MIT press office.
100 terahertz level
A University of Pittsburgh research team is claiming a communications breakthrough that they say could be used to speed up electronic devices such as smartphones and laptops in a big way. Their advance is demonstrated access to more than 100 terahertz of bandwidth (electromagnetic spectrum between infrared and microwave light), whereas electronic devices traditionally have been limited to bandwidth in the gigahertz realm.
Researchers Hrvoje Petek of the University of Pittsburgh and visiting professor Muneaki Hase of the University of Tsukuba in Japan have published their NSF-funded research findings in a paper in Nature Photonics. The researchers “detail their success in generating a frequency comb – dividing a single color of light into a series of evenly spaced spectral lines for a variety of uses – that spans a more than 100 terahertz bandwidth by exciting a coherent collective of atomic motions in a semiconductor silicon crystal.”
Petek says the advance could result in devices that carry a thousand-fold more information.
Separately, IBM researchers have developed a prototype optical chip that can transfer data at 1 Tbps, the equivalent of downloading 500 high-definition movies, using light pulses rather than by sending electrons over wires.
The Holey Optochip is described as a parallel optical transceiver consisting of a transmitter and a receiver, and designed to handle gobs of data on corporate and consumer networks.
Cooling off with graphene
Graphene is starting to sound like a potential wonder material for the electronics business. Researchers from the University of California at Riverside, the University of Texas at Dallas and Austin, and Xiamen University in China have come up with a way to engineer graphene so that it has much better thermal properties. Such an isotopically-engineered version of graphene could be used to build cooler-running laptops, wireless gear and other equipment. The need for such a material has grown as electronic devices have gotten more powerful but shrunk in size.
“The important finding is the possibility of a strong enhancement of thermal conduction properties of isotopically pure graphene without substantial alteration of electrical, optical and other physical properties,” says UC Riverside Professor of Electrical Engineering Alexander Balandin, in a statement. “Isotopically pure graphene can become an excellent choice for many practical applications provided that the cost of the material is kept under control.”
Such a specially engineered type of graphene would likely first find its way into some chip packaging materials, as well as into photovoltaic solar cells and flexible displays, according to UC Riverside. Beyond that, it could be used with silicon in computer chips and in interconnect wiring to spread heat.
Industry researchers have been making great strides on the graphene front in recent years. IBM, for example, last year said it had created the first graphene-based integrated circuit. Separately, two Nobel Prize winning scientists out of the U.K. have come up with a new way to use graphene – the thinnest material in the world – that could make Internet pipes feel a lot fatter.
Keeping GPS honest
Cornell University researchers are going on the offense against those who would try to hack GPS systems like those used in everything from cars to military drones to cellphone systems and power grids. Over the summer, Cornell researchers tested their system for outsmarting GPS spoofers during a Department of Homeland Security-sponsored demo involving a mini helicopter in the New Mexico desert at the White Sands Missile Range.
Cornell researchers have come up with GPS receiver modifications that allow the systems to distinguish between real and bogus signals that spoofers would use to trick cars, airplanes and other devices into handing over control. They emphasized that the threat of GPS spoofing is very real, with Iran last year claiming to have downed a GPS-guided American drone using such techniques.
Getting smartphones their ZZZZs
Purdue University researchers have come up with a way to detect smartphone bugs that can drain batteries while the phones are not in use.
“These energy bugs are a silent battery killer,” says Y. Charlie Hu, a Purdue University professor of electrical and computer engineering. “A fully charged phone battery can be drained in as little as five hours.”
The problem is that app developers aren’t perfect when it comes to building programs that need to perform functions when phones are asleep and that use APIs provided by smartphone makers. The researchers, whose work is funded in part by the National Science Foundation, investigated the problem on Android phones, and found that about a quarter of some 187 apps contained errors that could drain batteries. The tools they’re developing to detect such bugs could be made available to developers to help them cut down on battery-draining mistakes.
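A common class of such errors is the “no-sleep” bug, where a power-hungry resource (a wakelock or GPS fix, say) is acquired on some code path and never released. As a hedged, much-simplified illustration of detecting that pattern, the Python sketch below scans an app’s event trace for acquires without matching releases; the trace format is invented and this is not Purdue’s actual tool.

```python
# Simplified "no-sleep" energy-bug check: flag any power-hungry resource that
# is acquired in an app's event trace but never released, since that keeps the
# phone awake and drains the battery. The trace format is invented.
def find_energy_bugs(trace):
    held = {}                      # resource -> event index where it was acquired
    bugs = []
    for i, (event, resource) in enumerate(trace):
        if event == "acquire":
            held[resource] = i
        elif event == "release":
            held.pop(resource, None)
    for resource, i in held.items():
        bugs.append(f"{resource} acquired at event {i} but never released")
    return bugs

trace = [
    ("acquire", "wakelock"),
    ("acquire", "gps"),
    ("release", "gps"),
    # the app goes to background here, but the wakelock is still held...
]
for bug in find_energy_bugs(trace):
    print("possible energy bug:", bug)
```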
Quantum leap in search
University of Southern California and University of Waterloo researchers are exploring how quantum computing technology can be used to speed up the math calculations needed to make Internet search speedy even as the gobs of data on the Web expand.
The challenge is that Google’s page ranking algorithm is considered by some to be the largest numerical calculation carried out worldwide, and no quantum computer exists to handle that. However, the researchers have created models of the web to simulate how quantum computing could be used to slice and dice the Web’s huge collection of data. Early findings have been encouraging, with quantum computers shown through the models to be faster at ranking the most important pages and improving as more pages needed to be ranked.
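The quantum simulation itself can’t be reproduced here, but the classical calculation it targets, PageRank computed by power iteration over the link graph, is easy to sketch. The toy four-page web and the conventional 0.85 damping factor below are illustrative only.

```python
# Minimal classical PageRank by power iteration on a toy four-page web, to show
# the kind of large numerical calculation the quantum-simulation work targets.
links = {                        # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = sorted(links)
damping = 0.85
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):              # iterate until the ranks settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

for page in sorted(rank, key=rank.get, reverse=True):
    print(f"{page}: {rank[page]:.3f}")
```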
The research was funded by the NSF, NASA Ames Research Center, Lockheed Martin’s University Research Initiative and a Google faculty research award.
Sharing malware in a good way
Georgia Tech Research Institute security specialists have built a system called Titan designed to help corporate and government officials anonymously share information on malware attacks they are fighting, in hopes of fighting back against industrial espionage.
The threat analysis system plows through a repository of some 100,000 pieces of malicious code per day, and will give contributors quick feedback on malware samples that can be reverse-engineered by the Titan crew. Titan will also alert members to new threats, such as targeted spear-phishing attacks, and will keep tabs on not just Windows threats, but also those targeting Apple Macintosh and iOS, and Google Android systems.
“As a university, Georgia Tech is uniquely positioned to take this white hat role in between industry and government,” said Andrew Howard, a GTRI research scientist who is part of the Titan project. “We want to bring communities together to break down the walls between industry and government to provide a trusted, sharing platform.”
Touch-feely computing
Researchers from the University of Notre Dame, MIT and the University of Memphis are working on educational software that can respond to students’ cognitive and emotional states, and deliver the appropriate content based on how knowledgeable a student is about a subject, or even how bored he or she is with it.
AutoTutor and Affective AutoTutor get a feel for students’ mood and capabilities based on their responses to questions, including their facial expressions, speech patterns and hand movements.
“Most of the 20th-century systems required humans to communicate with computers through windows, icons, menus and pointing devices,” says Notre Dame Assistant Professor of Psychology Sidney D’Mello, an expert in human-computer interaction and AI in education. “But humans have always communicated with each other through speech and a host of nonverbal cues such as facial expressions, eye contact, posture and gesture. In addition to enhancing the content of the message, the new technology provides information regarding the cognitive states, motivation levels and social dynamics of the students.”
Mobile nets on the move
For emergency responders and others who need to take their mobile networks with them, even in fast-moving vehicles, data transmission quality can be problematic. North Carolina State University researchers say they’ve come up with a way to improve the quality of these mobile ad hoc networks (MANETs).
“Our goal was to get the highest data rate possible, without compromising the fidelity of the signal,” says Alexandra Duel-Hallen, a professor of electrical and computer engineering at NC State whose work is outlined in the paper “Enabling Adaptive Rate and Relay Selection for 802.11 Mobile Ad Hoc Networks.”
The challenge is that fast moving wireless nodes make it difficult for relay paths to be identified by the network, as channel power tends to fluctuate much more in fast-moving vehicles. The researchers have come up with an algorithm for nodes to choose the best data relay and transmission paths, based on their experience with recent transmissions.
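As a hedged simplification of that idea (not the NC State algorithm itself), the Python sketch below scores each candidate relay with an exponentially weighted average of its recent delivery outcomes, so older samples count less because channels in fast-moving vehicles change quickly; the smoothing factor and relay names are invented.

```python
# Simplified adaptive relay selection: track each candidate relay's recent
# delivery record with an exponentially weighted average and pick the relay
# with the best recent score. Smoothing factor and names are invented.
class RelayTracker:
    def __init__(self, alpha=0.4):
        self.alpha = alpha          # weight given to the newest observation
        self.scores = {}            # relay -> smoothed delivery-rate estimate

    def record(self, relay, delivered: bool):
        prev = self.scores.get(relay, 0.5)        # neutral prior for new relays
        self.scores[relay] = (1 - self.alpha) * prev + self.alpha * float(delivered)

    def best_relay(self):
        return max(self.scores, key=self.scores.get)

tracker = RelayTracker()
recent_results = [("relay1", True), ("relay2", True), ("relay1", False),
                  ("relay2", True), ("relay1", False), ("relay2", False)]
for relay, delivered in recent_results:
    tracker.record(relay, delivered)
print("choose:", tracker.best_relay(), tracker.scores)
```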
Tweet the Street
Researchers from the University of California, Riverside and Yahoo Research Barcelona have devised a model that uses data about tweet volumes to predict how financial markets will behave. Their model bested other baseline strategies by 1.4% to 11% and outperformed the Dow Jones Industrial Average during a four-month simulation.
“These findings have the potential to have a big impact on market investors,” said Vagelis Hristidis, an associate professor at the Bourns College of Engineering. “With so much data available from social media, many investors are looking to sort it out and profit from it.”
The research, focused on what Twitter volumes, retweets and who is doing the tweeting might say about individual stocks, differs from that of earlier work focused on making sense of the broader market based on positive and negative sentiments in tweets.
As with so many stock-picking techniques, the researchers here tossed out plenty of caveats about their system, which they said might work quite differently, for example, during a period of overall market growth rather than the down market that their research focused on.
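As a toy illustration only (not the researchers’ model, which also weighed retweets and who was tweeting), the simplest form of a tweet-volume signal flags a ticker on days when its tweet count spikes well above its recent average; the counts, window and threshold below are made up.

```python
# Toy tweet-volume spike detector: flag days when chatter about a ticker jumps
# well above its recent average. Window, threshold and counts are invented.
def volume_spikes(daily_counts, window=5, threshold=2.0):
    signals = []
    for i in range(window, len(daily_counts)):
        avg = sum(daily_counts[i - window:i]) / window
        if avg > 0 and daily_counts[i] > threshold * avg:
            signals.append(i)                    # day index with unusual chatter
    return signals

tweets_about_ticker = [120, 130, 110, 125, 140, 135, 400, 150, 145, 160]
print("spike on day(s):", volume_spikes(tweets_about_ticker))
```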
Franken-software
University of Texas at Dallas scientists have developed software dubbed Frankenstein that’s designed to be even more monstrous than the worst malware in the wild so that such threats can be understood better and defended against. Frankenstein can disguise itself as it swipes and messes with data, and could be used as a cover for a virus or other malware by stitching together pieces of such data to avoid antivirus detection methods.
“[Mary] Shelley’s story [about Dr. Frankenstein and his monster] is an example of a horror that can result from science, and similarly, we intend our creation as a warning that we need better detections for these types of intrusions,” said Kevin Hamlen, associate professor of computer science at UT Dallas who created the software, along with doctoral student Vishwath Mohan. “Criminals may already know how to create this kind of software, so we examined the science behind the danger this represents, in hopes of creating countermeasures.”
Such countermeasures might include infiltrating terrorist computer networks, the researchers say. To date, they’ve used the NSF and Air Force Office of Scientific Research-funded technology on benign algorithms, not any production systems.
Safer e-wallets
While e-wallets haven’t quite taken off yet, University of Pittsburgh researchers are doing their part to make potential e-wallet users more comfortable with the near-field communication (NFC) and/or RFID-powered technology.
Security has been a chief concern among potential users, who are afraid thieves could snatch their credit card numbers through the air. But these researchers have come up with a way for e-wallet credit cards to turn on and off, rather than being always on whenever in an electromagnetic field.
“Our new design integrates an antenna and other electrical circuitry that can be interrupted by a simple switch, like turning off the lights in the home or office,” says Marlin Mickle, the Nickolas A. DeCecco Professor of Engineering and executive director of the RFID Center for Excellence in the Swanson School. “The RFID or NFC credit card is disabled if left in a pocket or lying on a surface and unreadable by thieves using portable scanners.”
Mickle claims the advance is both simple and inexpensive, and once the researchers have received what they hope will be patent approval, they expect the technology to be adopted commercially.
Digging into Big Data
The University of California, Berkeley has been handed $10 million by the National Science Foundation as part of a broader $200 million federal government effort to encourage the exploration and better exploitation of massive amounts of information dubbed Big Data collected by far-flung wireless sensors, social media systems and more.
UC Berkeley has five years to use its funds for a project called the Algorithms, Machines and People (AMP) Expedition, which will focus on developing tools to extract important information from Big Data, such as trends that could predict everything from earthquakes to cyberattacks to epidemics.
“Buried within this flood of information are the keys to solving huge societal problems and answering the big questions of science,” said Michael Franklin, director of the AMP Expedition team and a UC Berkeley professor of electrical engineering and computer sciences, in a statement. “Our goal is to develop a new generation of data analysis tools that provide a quantum leap in our ability to make sense of the world around us.”
AMP Expedition researchers are building an open-source software stack called the Berkeley Data Analysis System (BDAS) that boasts large-scale machine-learning and data analysis methods, infrastructure that lets programmers take advantage of cloud and cluster computing, and crowdsourcing (in other words, human intelligence). It builds on the AMPLab formed early last year, with backing from Google, SAP and others.
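BDAS is a large stack and can’t be reduced to a snippet; as a toy, invented illustration of the “algorithms, machines and people” combination, the Python sketch below splits event records across worker processes for machine aggregation and routes low-confidence records to (simulated) human reviewers. Everything here, including the event labels and confidence cutoff, is hypothetical.

```python
# Toy flavor of the AMP idea: machines aggregate in parallel, and records the
# algorithm is unsure about are flagged for crowdsourced human review.
from collections import Counter
from multiprocessing import Pool

def count_events(chunk):
    """Machine part: each worker counts event types in its slice of the data."""
    return Counter(event for event, _confidence in chunk)

def needs_human(chunk, cutoff=0.6):
    """People part: records the classifier was unsure about go to reviewers."""
    return [event for event, confidence in chunk if confidence < cutoff]

if __name__ == "__main__":
    data = [("earthquake_report", 0.9), ("spam", 0.95), ("earthquake_report", 0.4),
            ("outage_report", 0.8), ("spam", 0.3), ("outage_report", 0.85)]
    chunks = [data[0::2], data[1::2]]                 # pretend these live on two machines

    with Pool(2) as pool:
        totals = sum(pool.map(count_events, chunks), Counter())

    print("machine totals:", dict(totals))
    print("send to crowd for review:", [e for c in chunks for e in needs_human(c)])
```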
IDG News Service and other IDG publications contributed to this report.
Bob Brown is the former news editor for Network World.