Xilinx’s Versal products are heterogeneous compute chips that target a wide array of network, compute, and machine learning applications to address the ever-growing needs of modern datacenters. Versal SoCs are highly integrated devices: they feature ARM-based cores for traditional processing, but also integrate a myriad of additional technologies…
San Jose-based adaptable chip manufacturer Xilinx announced an update to its line-up of SmartNICs today that leverages an array of new technologies. You can think of SmartNICs as intelligent network interface cards targeted at the ever-changing, ever-scaling needs of cloud data centers and high-performance compute applications.
Qualcomm announced its Snapdragon XR2 5G back in December of last year, the second-gen, 5G-enabled iteration of the company’s mobile XR platform. For the uninitiated, XR is a term Qualcomm has coined to encompass technologies employed for “Extended Reality” experiences, an amalgam of virtual reality, augmented reality, and mixed reality. Today the company is taking things a step further and announcing a new XR2 5G-based reference design…
Shares of Silicon Valley machine learning and gaming chip giant NVIDIA have spiked significantly in early morning trading, as the company posted stellar fiscal Q4 results for the period ending in January 2020, after the bell yesterday. Soundly thrashing analyst estimates with $3.11 billion in sales and $1.89 EPS against a $2.97 billion and $1.67 EPS consensus…
The cloud gaming services battle is shaping up to be big business across the globe in 2020, with primary competitors like Google Stadia, Microsoft xCloud, and NVIDIA’s GeForce NOW ramping up their respective services and technologies.
Dell’s Alienware gaming products division stole the spotlight for many tech enthusiasts at this year’s CES 2020 show in Las Vegas with its Concept UFO handheld PC gaming device. The product is currently just a prototype of a device that Dell-Alienware could launch someday, so no committed ship date or pricing details of any kind were given.
Currently, there are two dominant platforms employed in the majority of notebooks -- Intel’s and AMD’s -- though an array of fledgling Arm-based platforms are making inroads as well. Regardless, the vast majority of mainstream and enterprise consumers will likely be deciding between Intel- and AMD-based notebooks when it comes time to make a purchase. In this paper, we aim to determine how the two dominant notebook platforms perform in an array of scenarios, using an assortment of systems, including Dell’s Latitude 5400 and 7400 series laptops.
Even if you’re not a tech analyst, enthusiast, or hyper-connected geek who follows the industry, it was obvious that we achieved major technology milestones in 2019, and that the pace of innovation is now accelerating dramatically, in some areas exponentially. Artificial Intelligence (AI) was perhaps the hallmark of technological advancement this year, with amazing potential that is both exciting and, to some, perhaps even frightening.
Earlier this month, Qualcomm unveiled its latest mobile processing engine for next generation 5G smartphones, known as the Snapdragon 865 5G Mobile Platform. The company refers to the Snapdragon 865 as a “mobile platform” because, in conjunction with its Snapdragon X55 modem, it provides a comprehensive flagship 5G smartphone silicon solution…
Battery life is often one of the most important considerations, if not the most important, for general consumers and IT departments contemplating the purchase of a new notebook (or fleet of notebooks). Having the ability to enjoy a full PC experience and be productive while mobile and untethered from an electrical outlet for hours on end has transformed how -- and where -- we do business and entertain ourselves in recent years. For many road warriors, longer battery life translates directly into increased productivity, so maximum untethered uptime is paramount.
A prevailing -- but incorrect -- notion is that you need at least a $650 GeForce RTX 2080 card to get good performance with ray tracing enabled. That’s only true, perhaps, if you’re the type who needs to game at ultra-high resolutions with absolute max image quality settings. However, at FHD 1080p resolution, which is what the vast majority of mainstream gamers use, you may be surprised to learn that the lowest-cost GeForce RTX card -- currently the GeForce RTX 2060 -- can get the job done quite well…
So what gives, Adobe, Slack, and Dropbox? And heck, while we’re at it, let’s throw some very popular, graphically efficient mainstream game titles like Fortnite into the mix as well. The writing is on the wall with respect to Windows on Snapdragon and its always-connected, highly power-efficient advantages.
A few months back, I wrote about the MLPerf consortium and the release of its Inference v0.5 benchmark. MLPerf had previously disclosed some performance results from its Training v0.6 benchmark, but training is only part of the machine learning equation. It is only when the training process is complete and weights have been learned from the dataset that a neural network can intelligently infer things from new data -- this process is what is referred to as inference.
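For readers who want a concrete picture of that training-versus-inference split, here is a minimal sketch in plain Python/NumPy -- a toy logistic model of my own devising, not MLPerf benchmark code -- in which weights are first learned from training data and then frozen and reused for inference on unseen inputs.

```python
# Toy illustration of training vs. inference (not MLPerf code).
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: the label is 1 when x0 + x1 > 1, else 0.
X_train = rng.random((200, 2))
y_train = (X_train.sum(axis=1) > 1.0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Training phase: iteratively adjust weights via gradient descent ---
w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(2000):
    p = sigmoid(X_train @ w + b)                       # forward pass
    w -= lr * (X_train.T @ (p - y_train)) / len(X_train)
    b -= lr * np.mean(p - y_train)

# --- Inference phase: weights are now fixed; just run the forward pass ---
X_new = np.array([[0.9, 0.8], [0.1, 0.2]])             # unseen inputs
predictions = sigmoid(X_new @ w + b) > 0.5
print(predictions)                                     # expected: [ True False]
```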
For the past couple of years, Intel has been uncharacteristically vocal about its plans to enter the discrete GPU market, to take on graphics stalwarts AMD and NVIDIA in both consumer PCs and the data center. Historically, entities like Intel have kept details of unreleased, forward-looking products hush-hush until they are much closer to being introduced.
To date, LG U+ has introduced a GeForce NOW trial in Korea in association with a mobile subscription plan, and Softbank recently kicked off pre-registrations in Japan for a free beta that is slated to launch this winter. And today at the IgroMir Expo, a large video gaming event currently underway in Moscow, SAFMAR Group introduced the GeForce NOW service in Russia.
As part of its evolving strategy, Silicon Valley bellwether Xilinx has been integrating this adaptive technology into platform accelerator solutions for machine learning, as well as domain-specific architecture solutions that incorporate various compute resources like ARM cores, high-speed IO, and even RF functions.
For some time now, Qualcomm has been evangelizing the idea that the next-generation 5G wireless rollout will happen significantly faster than the previous-generation 4G rollout did back in the day. Today, putting its money where its mouth is, the company has made a number of announcements at the IFA show in Berlin that not only underscore the depth and breadth of its commitment to 5G, but also help shape the mainstream 5G landscape…
With the rapid proliferation of wirelessly connected smart devices, the need for more advanced Wi-Fi networks that offer greater capacity, reliability, range, and performance has never been greater. To that end, a number of players have announced bleeding-edge 802.11ax, or Wi-Fi 6, chipsets for various networking applications, including Qualcomm, which just unveiled an extensive array of Pro Series platforms that target a number of different performance segments.
If there were any doubt in your mind that real-time ray tracing is the wave of the future for cutting-edge gaming and 3D graphics, one of the most popular game titles of all time, Minecraft, just got a full-scene RT makeover, and it looks gloriously good. Mojang, Microsoft, and NVIDIA just announced in a Gamescom 2019 unveiling that full-scene, path-based ray tracing will be made available in Minecraft…
Google open-sourced BERT so that others could train their own conversational question answering systems. And today, NVIDIA announced that its AI compute platform was the first to train BERT in less than an hour and complete AI inference in just over 2 milliseconds.
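To put that latency figure in context, here is a rough, hedged way to time a single BERT forward pass yourself, using PyTorch and the Hugging Face transformers library. This is only an illustration of what “inference latency” means, not NVIDIA’s benchmark harness; the roughly 2-millisecond result above comes from heavily optimized GPU deployments, so expect far higher numbers on ordinary hardware.

```python
# Rough single-query BERT inference timing (illustrative only).
import time

import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased").eval()

inputs = tokenizer("How fast can BERT answer a question?", return_tensors="pt")

with torch.no_grad():
    model(**inputs)                                    # warm-up pass
    start = time.perf_counter()
    model(**inputs)                                    # timed inference pass
    latency_ms = (time.perf_counter() - start) * 1000.0

print(f"Single-query BERT inference latency: {latency_ms:.1f} ms")
```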