
Exclusive Interview with AMD’s Sasa Marinkovic: Top 8 Trends in Computing

I recently had some time to sit down with AMD's Sasa Marinkovic, Head of Technology Marketing in the Client Business Unit. I have always enjoyed chatting with Sasa; he is an intelligent, focused executive with a clear strategy for the future.

So much is happening simultaneously in the realm of personal computing that simply staying abreast of the popular labels for the latest technology trends can be just as challenging as understanding the concepts and grasping the implications.

We asked Sasa to give a quick overview of some of the major technology trends that are shaping the computing world. Here are Sasa’s top eight computing trends, ordered without regard to relative importance:

1. Cloud Computing
One of the most popular— and overused — computing terms over the past decade is “cloud computing.” Like having a “mainframe in the sky,” cloud computing has profoundly empowered users by harnessing the massive processing power of thousands of servers for their individual computing needs.

Users can now run an application, store data, or perform almost any computing task in the cloud instead of using the limited resources of their personal computer — and they can do it from anywhere in the world. Users new to the concept of cloud computing were a bit hesitant to embrace it a few years ago, but today’s cloud computing has become part of the fabric of everyday computing, and the strong foundation for the next generation of computing itself.

2. Connected Computing
Connected computing treats any object that runs on a power source, even something as simple as a lightbulb, as a candidate for connection to the Internet of Things (IoT).

The massive amounts of data generated by billions of connected devices will make today’s “Big Data” seem small in comparison, and will be far beyond human capabilities to monitor, analyze, and control. Advances and innovations in compute performance and machine-to-machine communications will eliminate the need for humans acting as the primary creators and “routers” of information.

3. Contributing Computing
Contributing computing collectively harnesses the “contributed” compute processing power of hundreds of thousands of computer servers, or millions of connected IoT computing devices. Imagine the immense processing power if every connected desktop PC, notebook, tablet, and smartphone could share their processing capabilities.

A number of applications are already doing this, including Folding@home, digital currency mining, and several others. The next-generation "Internet of Things" assumes that all connected compute devices will ultimately share data and contribute processing power, a key element of tomorrow's supercomputing.
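As an illustrative sketch only (not the protocol of Folding@home or any real project), the contribute-and-combine pattern behind contributing computing can be shown with worker threads standing in for contributing devices:

```python
from concurrent.futures import ThreadPoolExecutor

def work_unit(chunk):
    # Stand-in for one contributed work unit (a real project would run
    # a folding or hashing step here); this one just sums squares.
    return sum(x * x for x in chunk)

def distribute(data, n_workers=4):
    # Carve the job into one chunk per "contributing" device...
    chunks = [data[i::n_workers] for i in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = pool.map(work_unit, chunks)
    # ...then recombine the partial results centrally.
    return sum(partials)

print(distribute(range(1000)))  # 332833500
```

The key design point is that each work unit is independent, so devices can join or drop out without coordinating with one another; only the final combine step is centralized.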

4. Supercomputing
Harnessing as many standalone processors as possible to process the massive amounts of data generated by billions of connected devices may prove to be the fastest way to reach unprecedented levels of "supercomputing" capacity.

Vastly more powerful supercomputers are badly needed for modeling and predicting climate changes, supporting medical modeling for personalized medicine, creating new drugs in response to rapidly spreading viruses, improving efficiency in aerodynamics and industrial design, developing controlled fusion, exploring new forms of clean energy, and more.
 
5. Visual Computing
The low-resolution VGA computer monitors of two decades ago have quickly evolved into high-resolution 4K displays, with pixel densities more than doubling by packing ever-greater resolutions into ever-smaller displays. Pushing more on-screen pixels at higher frame rates requires vastly more powerful graphics processing units (GPUs).
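The jump from VGA to 4K can be made concrete with a little arithmetic, using the standard resolutions for each (640×480 for VGA, 3840×2160 for 4K UHD):

```python
# Pixel counts for the two display generations mentioned above.
vga = 640 * 480          # 307,200 pixels
uhd_4k = 3840 * 2160     # 8,294,400 pixels

# A 4K panel pushes 27 times the pixels of a VGA display...
print(uhd_4k // vga)     # 27

# ...and at 60 frames per second the GPU must fill roughly
# half a billion pixels every second.
print(uhd_4k * 60)       # 497,664,000 pixels per second
```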

The GPUs of a decade ago delivered about 100 GFLOPS of processing performance, but the processing performance of today's GPUs has increased exponentially (it is now measured in teraFLOPS) in order to maintain visual quality and smooth video playback. The role of graphics and parallel processing becomes even more important when discussing the "visual computing" applications and user experiences delivered by high-definition displays and video, gaming and videochat, and next-generation user interfaces involving virtual and augmented reality.

6. Heterogeneous Computing
A perfect example of the new breed of heterogeneous processor is the APU, an accelerated processing unit combining the serial-processing capabilities of a traditional CPU with the parallel-processing capabilities of an advanced graphics processing unit (GPU) on a single chip.

Working in harmony, the heterogeneous elements of an APU deliver massive compute capability to everyday computing tasks. Today’s APUs boost processing performance beyond 750 GFLOPs — a benchmark approaching supercomputing territory.
 
7. Secure Computing
The fact that computer security threats like the recent Heartbleed bug now sport their own custom logos in news reports is a telling indicator of the public's top-of-mind awareness of computing security and safety. Building security directly into the silicon is a simple and cost-effective approach, and security technologies enabling safer online transactions and mobile payments will be built into SoC products.
 
8. Ambidextrous Computing
The vast majority of notebooks, desktops, and servers available today run on the x86 processor architecture. Similarly, most of today's tablets and smartphones run on ARM processor architectures. Both architectures have introduced innovations that create new possibilities for the technology industry.

But imagine a computing world where x86 and ARM ecosystems work together in heterogeneous computing harmony. All the technology trends mentioned above absolutely depend on a new level of close cooperation and collaboration between today’s leading hardware ecosystems. The good news: it’s already happening.

Sasa added, "These major trends are driving a new era of computing, and are profoundly influencing how devices operate and cooperate, and how we control and interact with them. The Internet of Things is driving the creation of Big Data, anticipatory analytics, and deep learning.

"Sleek new form factors are driving ultra-portability, and new user interfaces are moving us into the immersive world of virtual and augmented reality.

"With game-changing new technologies including HSA, Mantle, GCN, PSP (the platform security processor) and TrueAudio, I believe AMD is helping to lead the industry into this next era of computing, and to deliver amazing and empowering new user experiences to our customers."


I would like to thank Sasa for taking the time to chat with us this week – Allan Campbell, Editor In Chief, Kitguru.



4 comments

  1. Interesting read thanks

  2. While this was a good read from a few angles, I do have a complete lack of faith in Mantle. It is such a good idea, but hardly supported. Take Thief for instance: full Mantle support, yet a broken CrossFire profile since it released.

    They are slightly disorganised as a team IMO, with some great individuals in the mix.

  3. AMD are much stronger this year than last. The R9 295X2 shows they can bat with the best of them and they are focusing on Mantle a lot more (I don’t entirely agree with Union Flag; BF4 Mantle support is fantastic, and surely it must come from the developer, not the company). That said, Mantle may have a problem going forward as Nvidia invest a lot more money into TWIMTBP games than AMD, so they will be fighting against developers who get a handout from the guys in green.

    Some good products – I like their APUs, but their high-end desktop processors are seriously lacking. They need to focus more on the high end again and compete with Intel. They can fight against Nvidia, which is good.