We Hope to See You at Intel® FPGA Technology Day 2021
Intel® FPGA Technology Day (IFTD) is a free four-day event that will be hosted virtually across the globe in North America, China, Japan, EMEA, and Asia Pacific from December 6-9, 2021. The theme of IFTD 2021 is “Accelerating a Smart and Connected World.” This virtual event will showcase Intel® FPGAs, SmartNICs, and infrastructure processing units (IPUs) through webinars and demonstrations presented by Intel experts and partners. The sessions are designed to be of value for a wide range of audiences, including Technology Managers, Product Managers, Board Designers, and C-Level Executives. Attendees at this four-day event will learn how Intel’s solutions can solve the toughest design challenges and provide the flexibility to adapt to the needs of today’s rapidly evolving markets. A full schedule of Cloud, Networking, Embedded, and Product Technology sessions, each just 30 minutes long, will enable you to build the best agenda for your needs.

- Day 1 (December 6), TECHNOLOGY: FPGAs for a Dynamic Data-Centric World. Advances in cloud infrastructure, networking, and computing at the edge are accelerating. Flexibility is key to keeping pace with this transforming world. Learn about innovations developed and launched in 2021, along with new Intel FPGA technologies that address key market transitions.
- Day 2 (December 7), CLOUD AND ENTERPRISE: Data Center Acceleration. The cloud is changing. Disaggregation improves data center performance and scalability but requires new tools to keep things optimized. Intel FPGA smart infrastructure enables smarter applications to make the internet go fast!
- Day 3 (December 8), EMBEDDED: Transformation at the Edge. As performance and latency continue to dictate compute’s migration to the edge, Intel FPGAs provide the workload consolidation and optimization required, with software-defined solutions enabled by a vast and growing partner ecosystem.
- Day 4 (December 9), NETWORKING: 5G, the Need for End-to-End Programmability. The evolution of 5G continues to push the performance-to-power envelope, requiring market leaders to adapt or be replaced. Solutions for 5G and beyond will require scalable and programmable portfolios to meet evolving standards and use cases.

To explore the detailed program, see the featured speakers, and register for the North America event, click here. Register in other regions below: EMEA, China, Japan, Asia Pacific.

Megh Computing demos advanced, scalable Video Analytics Solution portfolio at WWT’s Advanced Technology Center
Megh Computing’s Video Analytics Solution (VAS) portfolio implements a flexible and scalable video analytics pipeline consisting of the following elements:

- Video Ingestion
- Video Transformation
- Object Detection and Inference
- Video Analytics Visualization

Because Megh’s VAS is scalable, it can handle real-time video streams from a few to more than 150 video cameras. Because it’s flexible, you can use the VAS pipeline elements to construct a wide range of video analytics applications such as:

- Factory floor monitoring to ensure that unauthorized visitors and employees avoid hazardous or secure areas
- Industrial monitoring to ensure that production output is up to specifications
- Smart City highway monitoring to detect vehicle collisions and other public incidents
- Retail foot-traffic monitoring to aid in kiosk, endcap, and product positioning, and other merchandising activities
- Museum and gallery exhibit monitoring to ensure that safe distances are maintained between visitors and exhibits

Because the number of cameras can be scaled to well more than 100 when using the VAS portfolio, Megh needed a foundational technology portfolio that would support the solution’s demanding video throughput, computing, and scalability requirements. Megh selected a broad, integrated Intel technology portfolio that includes the latest 3rd Generation Intel® Xeon® Scalable processors, Intel® Stratix® FPGAs, Intel® Core™ processors, and the Intel® Distribution of OpenVINO™ toolkit.

To demo the capabilities of the VAS, Megh also chose WWT’s Advanced Technology Center (ATC), a collaborative ecosystem for designing, building, educating, demonstrating, and deploying innovative technology products and integrated architectural solutions for WWT customers, partners, and employees. WWT built and is hosting a Megh VAS environment within its ATC that allows WWT and Megh customers to explore this solution in a variety of use cases, including specific customer environment needs and other requirements.

For more information about the Megh VAS portfolio and the WWT ATC, check out the WWT blog here.

Notices and Disclaimers
Intel is committed to respecting human rights and avoiding complicity in human rights abuses. See Intel’s Global Human Rights Principles. Intel’s products and software are intended only to be used in applications that do not cause or contribute to a violation of an internationally recognized human right. Intel does not control or audit third-party data. You should consult other sources to evaluate accuracy. Intel technologies may require enabled hardware, software, or service activation. No product or component can be absolutely secure. Your costs and results may vary. © Intel Corporation. Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.
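Megh’s VAS implementation itself is proprietary, but as a rough, single-stream sketch of the pipeline stages described above (ingestion, transformation, detection and inference, and analytics visualization), the following Python example uses OpenCV and the OpenVINO™ runtime. The model file, stream URL, and SSD-style output layout are assumptions for illustration, not details of Megh’s product.

```python
# Minimal single-stream sketch of an ingest -> transform -> infer -> visualize
# pipeline. The model path, stream URL, and SSD-style output format are
# illustrative assumptions, not Megh's implementation.
import cv2
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("person-detection.xml")      # hypothetical OpenVINO IR model
compiled = core.compile_model(model, "CPU")          # or "GPU"/FPGA plugins where available
output_layer = compiled.output(0)
_, _, in_h, in_w = compiled.input(0).shape

cap = cv2.VideoCapture("rtsp://camera.example/stream")   # ingestion (placeholder URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Transformation: resize and reorder HWC -> NCHW float32
    blob = cv2.resize(frame, (in_w, in_h)).transpose(2, 0, 1)[np.newaxis].astype(np.float32)
    # Detection and inference
    detections = compiled([blob])[output_layer].reshape(-1, 7)
    # Visualization: draw boxes for confident detections
    for _, _, conf, x0, y0, x1, y1 in detections:
        if conf > 0.5:
            pt1 = (int(x0 * frame.shape[1]), int(y0 * frame.shape[0]))
            pt2 = (int(x1 * frame.shape[1]), int(y1 * frame.shape[0]))
            cv2.rectangle(frame, pt1, pt2, (0, 255, 0), 2)
    cv2.imshow("analytics", frame)
    if cv2.waitKey(1) == 27:      # Esc to quit
        break
cap.release()
```

In a deployment like the one described above, many such streams would be multiplexed and the inference stage offloaded to accelerators; this sketch only shows the shape of the data flow.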
Terasic DE10-Agilex Accelerator PCIe board combines Intel® Agilex™ F-Series FPGA with four DDR4 SO-DIMM SDRAM sockets and two QSFP-DD connectors

If you’re itching to get your hands on the innovative features built into the new family of Intel® Agilex™ FPGAs, like the second-generation Intel® HyperFlex™ architecture or the improved DSP capabilities, including half-precision floating point (FP16) and BFLOAT16 computational abilities, then consider the new Terasic DE10-Agilex Accelerator board. This PCIe card combines an Intel Agilex F-Series FPGA with four independent DDR4 SO-DIMM SDRAM sockets and two QSFP-DD connectors on a three-quarter-length PCIe board. The board’s host interface is a PCIe Gen 4.0 x16 port. Each SO-DIMM memory socket accommodates 8 or 16 Gbytes of DDR4 memory, for a maximum total SDRAM capacity of 64 Gbytes, and each QSFP-DD connector accommodates Ethernet transceiver modules up to 200G. The board is available with two different cooling options: a 2-slot version with integrated fans or a single-slot, passively cooled version.

The Terasic DE10-Agilex Accelerator PCIe card combines an Intel® Agilex™ F-Series FPGA with four independent DDR4 SO-DIMM SDRAM sockets and two QSFP-DD connectors.

The Terasic DE10-Agilex PCIe board supports the Intel® OpenVINO™ toolkit, OpenCL™ BSP, and Intel® oneAPI Toolkits used for developing code for myriad high-performance workloads, including computer vision and deep learning. The Intel Agilex FPGA family delivers up to 40% higher performance¹ or up to 40% lower power¹ for data center, NFV and networking, and edge compute applications.

For more technical information about the Terasic DE10-Agilex Accelerator Board or to order the product, please contact Terasic directly.

Notices and Disclaimers
¹ This comparison is based on the Intel® Agilex™ FPGA and SoC family vs. Intel® Stratix® 10 FPGA using simulation results and is subject to change. This document contains information on products, services and/or processes in development. All information provided here is subject to change without notice. Contact your Intel representative to obtain the latest forecast, schedule, specifications, and roadmaps. Intel technologies may require enabled hardware, software or service activation. No product or component can be absolutely secure. Your costs and results may vary. Intel does not control or audit third-party data. You should consult other sources to evaluate accuracy. © Intel Corporation. Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.
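Bring-up details vary by board and BSP release, but a common first step after installing a board’s OpenCL™ BSP is simply confirming that the host can see the FPGA platform and its memory. The sketch below does that with pyopencl; the "FPGA" platform-name match is an assumption, since the exact string depends on the installed BSP.

```python
# Sketch: enumerate OpenCL platforms and devices to confirm an FPGA board's
# BSP is visible to the host. The "FPGA" name match is an assumption; the
# exact platform string depends on the installed BSP release.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} (vendor: {platform.vendor})")
    for device in platform.get_devices():
        gib = device.global_mem_size / 2**30
        print(f"  Device: {device.name}  global memory: {gib:.1f} GiB")
    if "FPGA" in platform.name.upper():
        print("  -> FPGA platform detected")
```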
Tech for Good Magazine article describes how Project: CORaiL is helping to save coral reefs

The August 2020 issue of Tech For Good, a publication dedicated to highlighting how new technologies are bringing positive change to the world around us, chose Project: CORaiL as its cover story. This project – jointly launched by Accenture, the non-profit Sulubaaï Environmental Foundation, and Intel – aims to develop an innovative solution for recreating and restoring coral reefs to their former health.

In 2016, the Sulubaaï Environmental Foundation created a marine-protected area around Pangatalan Island in the Philippines with the aim of restoring the coral reefs around the island to health. The reefs had been badly damaged by dynamite fishing. To that end, the Sulubaaï Environmental Foundation developed the Sulu-Reef Prosthesis, an underwater concrete scaffold that supports coral regrowth, and the foundation needed to evaluate the effectiveness of this new scaffold design. That need prompted the creation of Project: CORaiL, and the Tech For Good article, written by Daniel Brigham and titled “How to Save the Coral Reef,” describes how the project came to be.

Initially, the effectiveness of the Sulu-Reef Prosthesis scaffolding was judged by analyzing individual photographs taken by scuba divers, a slow manual process. That manual process did not produce nearly enough data, and the divers inadvertently disturbed the environment in the reef. That’s when Accenture and Intel became involved. The article explains how Ewen Plougastel, Managing Director at Accenture Applied Intelligence, became interested in the Sulubaaï Environmental Foundation’s reef project and decided to combine his passion for scuba diving and for the marine ecosystem with his day job. The Accenture team approached the Sulubaaï Environmental Foundation, and together with Intel, they launched Project: CORaiL.

The project included the development of an intelligent underwater camera that incorporates Accenture’s Video Analytics Services Platform (VASP), which provides a toolset for rapidly building and deploying AI capabilities. Accenture’s VASP provides multiple powerful analytics and visualization tools to help analysts increase operational insight and make timely decisions from computer vision models.

The Tech For Good article quotes Patrick Dorsey, Vice President and Data Platforms Group General Manager for FPGA and Power Products at Intel, who said, “We looked at the problem CORaiL was trying to solve, and we discovered if we took multiple technologies from the Intel platform we could effectively solve their problems.” Project: CORaiL is powered by multiple Intel technologies including Intel® Xeon® CPUs, Intel® FPGA Programmable Acceleration Cards, and the Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X Vision Processing Unit (VPU) and the Intel® Distribution of OpenVINO™ toolkit.

According to the Tech for Good article, Project: CORaiL has collected 40,000 images so far, allowing researchers “to gauge the health of the coral reef in real time.” Be sure to read “How to Save the Coral Reef” in the latest issue of Tech For Good for more details. For more information about Project: CORaiL, see “Accenture, the Sulubaaï Environmental Foundation, and Intel partner to create the CORaiL underwater vision system to help restore fragile coral reef ecosystems.”

Intel is committed to respecting human rights and avoiding complicity in human rights abuses. See Intel’s Global Human Rights Principles.
Intel’s products and software are intended only to be used in applications that do not cause or contribute to a violation of an internationally recognized human right. Intel’s silicon and software portfolio empowers our customers’ intelligent services from the cloud to the edge.

Notices & Disclaimers
Intel does not control or audit third-party data. You should consult other sources to evaluate accuracy. Intel technologies may require enabled hardware, software or service activation. No product or component can be absolutely secure. Your costs and results may vary. © Intel Corporation. Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.

Financial Times article discusses Sulubaaï Environmental Foundation’s efforts to save coral reefs using smart underwater cameras and AI
Last week, the Financial Times published an article by Adam Green titled “Tech knowhow gives new lease of life to marine habitats.” The article discusses various efforts underway to combat the demise of sensitive marine ecosystems, especially coral reefs, and the first project discussed is the Sulubaaï Environmental Foundation’s efforts to regrow and rebuild Pangatalan Island’s marine protected area in the Philippines. This project, done in conjunction with Accenture and Intel, has developed an innovative solution for recreating and restoring the island’s coral reefs to their former health. The solution adds AI capabilities to underwater cameras that automatically capture and process tens of thousands of images of fish and other marine species to identify and monitor their migration patterns and their daily life in the reef.

The Financial Times article quotes Patrick Dorsey, a Vice President in the Intel Programmable Solutions Group, who discusses the reasoning behind the creation of the automated camera system, which is both more accurate and less disruptive than human divers when used to catalog ongoing changes to the reef’s marine life.

The smart underwater video cameras, located near the concrete scaffolds, employ Accenture’s Video Analytics Services Platform (VASP), which provides a toolset for rapidly building and deploying AI capabilities. VASP is powered by multiple Intel technologies including Intel® Xeon® CPUs, Intel® FPGA Programmable Acceleration Cards (PACs), and the Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X Vision Processing Unit (VPU) and the Intel® Distribution of OpenVINO™ toolkit.

For more information about the technology behind this project, see “Accenture, the Sulubaaï Environmental Foundation, and Intel partner to create the CORaiL underwater vision system to help restore fragile coral reef ecosystems.”

Notices and Disclaimers
Intel technologies may require enabled hardware, software or service activation. No product or component can be absolutely secure. Your costs and results may vary. Intel does not control or audit third-party data. You should consult other sources to evaluate accuracy. © Intel Corporation. Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.

The iAbra PathWorks toolkit brings embedded AI inference, real-time video recognition to the edge with Intel® Arria® 10 FPGAs, Intel® Xeon® Gold CPUs, and Intel® Atom® processors
It’s not always easy to get data to the cloud. Multi-stream computer vision applications, for example, are extremely data intensive and can overwhelm even 5G networks. A company named iAbra has created tools that build neural networks that run on FPGAs in real time, so that inference can be carried out at the edge in small, light, low-power embedded devices rather than in the cloud. Using what-you-see-is-what-you-get (WYSIWYG) tools, iAbra’s PathWorks toolkit creates neural networks that run on an Intel® Atom® x7-E3950 processor and an Intel® Arria® 10 FPGA in the embedded platform. The tools themselves run on an Intel® Xeon® Gold 6148 CPU to create the neural networks.

From a live video stream, artificial intelligence (AI) can detect, for example, how many people are standing in a bus queue, which modes of transport people are using, and where there is flooding or road damage. In exceptional circumstances, AI can also alert emergency services if vehicles are driving against the traffic flow or if pedestrians have suddenly started running. Collecting reliable, real-time data from the streets and compressing it through AI inference makes it far easier to manage resources and to improve quality of life, productivity, and emergency response times in Smart Cities.

To be effective, these vision applications must process a huge amount of data in real time. A single HD stream generates 800 to 900 megabits of streaming video data per second. That’s per camera. Although broadband 5G networks deliver more bandwidth and can greatly increase the device density within geographic regions, broadly and densely distributed armadas of video cameras still risk overwhelming these networks. The solution to this bandwidth constraint is to incorporate real-time AI inference at the network edge so that only the processed, essential information is sent to the cloud. That sort of processing requires an embedded AI device that can withstand the harsh environments and resource constraints found at the edge.

iAbra has approached the problem of building AI inference into embedded devices by mimicking the human brain using FPGAs. Usually, image recognition solutions map problems to generic neural networks, such as ResNet. However, such networks are too big to fit into many FPGAs destined for embedded use. Instead, iAbra’s PathWorks toolkit constructs a new, unique neural network for each problem, tailored and highly optimized for the target FPGA architecture where it will run. In this case, the target architecture is an Intel Arria 10 FPGA.

“We believe the Intel Arria 10 FPGA is the most efficient part for this application today, based on our assessment of the performance per watt,” said iAbra’s CTO Greg Compton. “The embedded platform also incorporates the latest generation Intel Atom processor, which provides a number of additional instructions for matrix processing over the previous generation. That makes it easier to do vector processing tasks. When we need to process the output from the neural network, we can do it faster with instructions that are better attuned to the application,” Compton explains. He adds: “A lot of our customers are not from the embedded world.
By using Intel Atom processors, we enable them to work within the tried and tested Intel® architecture stack they know.” Similarly, said Compton: “We chose the Intel Xeon Gold 6148 processor for the network creation step as much for economics as performance.”

iAbra developed this solution using OpenCL, a programming framework that makes FPGA programming more accessible by using a language similar to C, enabling code portability across different types of processing devices. iAbra also uses Intel® Quartus® Prime Software for FPGA design and development and the Intel® C++ Compiler to develop software. The company has incorporated the Intel® Math Kernel Library (Intel® MKL), which provides optimized code for mathematical operations across a range of processing platforms.

Compton continues: “With Intel MKL, Intel provides highly optimized shortcuts to a lot of low-level optimizations that really help our programmer productivity. OpenCL is an intermediate language that enables us to go from the high-level WYSIWYG world to the low-level transistor bitmap world of FPGAs. We need shortcuts like these to reduce the problem domains, otherwise developing software like ours would be too big a problem for any one organization to tackle.”

iAbra participates in the Intel FPGA Partner Program and Intel® AI Builders Program, which gives the company access to the Intel® AI DevCloud. “The Intel® AI DevCloud enables us to get cloud access to the very latest hardware, which may be difficult to get hold of, such as some highly specialized Intel® Stratix® 10 FPGA boards. It gives us a place where Intel customers can come and see our framework in a controlled environment, enabling them to try before they buy. It helped us with our outreach for a Smart Cities project recently. It’s been a huge help to have Intel’s support as we refine our solution, and develop our code using Intel’s frameworks and libraries. We’ve worked closely with the Intel engineers, including helping them to improve the OpenCL compiler by providing feedback as one of its advanced users,” Compton concludes.

For more information about the iAbra Pathworks toolkit, please see the new Case Study titled “Bringing AI Inference to the Edge.” Intel’s silicon and software portfolio empowers our customers’ intelligent services from the cloud to the edge.

Notices & Disclaimers
Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors. Performance tests, such as SYSmark and MobileMark, are measured using specific computer systems, components, software, operations and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products. For more complete information visit www.intel.com/benchmarks. Performance results are based on testing as of dates shown in configurations and may not reflect all publicly available updates. See backup for configuration details. No product or component can be absolutely secure. Your costs and results may vary. Intel technologies may require enabled hardware, software or service activation. © Intel Corporation. Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.
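PathWorks generates the FPGA image itself, but the host-side flow it plugs into is ordinary OpenCL. As a generic sketch of that flow (not iAbra’s actual code: the bitstream file name and kernel name are hypothetical), here is how a host might load a precompiled FPGA binary and run one inference with pyopencl.

```python
# Generic OpenCL host flow for an FPGA accelerator: load a precompiled
# bitstream, move data to the device, launch a kernel, and read results back.
# The .aocx file name and 'classify_frame' kernel are hypothetical placeholders.
import numpy as np
import pyopencl as cl

platform = next(p for p in cl.get_platforms() if "FPGA" in p.name.upper())
device = platform.get_devices()[0]
ctx = cl.Context([device])
queue = cl.CommandQueue(ctx)

with open("network_image.aocx", "rb") as f:              # hypothetical FPGA bitstream
    program = cl.Program(ctx, [device], [f.read()]).build()

frame = np.zeros((1, 3, 224, 224), dtype=np.float32)     # stand-in input tensor
scores = np.empty(1000, dtype=np.float32)                 # stand-in output scores

mf = cl.mem_flags
in_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=frame)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, scores.nbytes)

program.classify_frame(queue, (1,), None, in_buf, out_buf)  # single work-item launch
cl.enqueue_copy(queue, scores, out_buf)
queue.finish()
print("top class:", int(scores.argmax()))
```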
Accenture, the Sulubaaï Environmental Foundation, and Intel partner to create the CORaiL underwater vision system to help restore fragile coral reef ecosystems
Accenture has partnered with the Sulubaaï Environmental Foundation and Intel to develop an innovative solution for recreating and restoring coral reefs to their former health by installing special equipment, including underwater cameras, in the Pangatalan Island marine protected area in the Philippines. This system supports the Sulubaaï Environmental Foundation’s efforts to regrow and rebuild the island’s coral reef ecosystem. Coral reefs are one of the most diverse ecosystems on planet Earth, and they’re threatened by a host of challenges including overfishing, deteriorating water quality, and ocean pollution.

The project is called CORaiL, where the “ai” stands for “artificial intelligence.” The engineering solution combines mobile, digital, and deep learning technologies. By employing rapid-prototyping techniques, the engineering team quickly developed an edge computing solution to monitor the progress of the restoration efforts by observing, classifying, and measuring marine life activity in the coral reef. The CORaiL testbed was initially launched in May 2019.

Coral reef restoration efforts in the Pangatalan marine protected area involve the installation of concrete structures on the sea bottom. These structures serve as scaffolds that anchor new fragments of living coral. The CORaiL system studies these living coral fragments as they grow over time. As the coral reef grows, it creates a larger and larger haven for marine life. Consequently, the CORaiL system also monitors the presence of fish.

An artificial concrete reef that provides support for unstable coral fragments underwater, implemented by Accenture, Intel, and the Sulubaaï Environmental Foundation in the coral reef surrounding Pangatalan Island in the Philippines. Photo Credit: Accenture

Smart underwater video cameras located near the concrete scaffolds employ Accenture’s Video Analytics Services Platform (VASP), which provides a toolset for rapidly building and deploying AI capabilities. Accenture’s VASP provides multiple powerful analytics and visualization tools to help analysts increase operational insight and make timely decisions from computer vision models. VASP is powered by multiple Intel technologies including Intel® Xeon® CPUs, Intel® FPGA Programmable Acceleration Cards (PACs), and the Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X Vision Processing Unit (VPU) and the Intel® Distribution of OpenVINO™ toolkit. The result is an effective, non-invasive platform for ongoing observation.

The underwater cameras detect and photograph fish as they swim past. Deep learning algorithms built into the cameras then count and classify the fish. The processed data travels wirelessly to the ocean surface and then on to onshore servers, where it is analyzed and stored. The resulting data and reports allow the research team to make data-driven decisions about the restoration work. Accenture hopes to use this system for other marine applications, including fish migration studies and intrusion detection and monitoring of restricted underwater areas such as marine sanctuaries and fish farms.

For more information, see “Using Artificial Intelligence to Save Coral Reefs.” For more information about the Accenture VASP system, click here.

Notices and Disclaimers
Intel technologies may require enabled hardware, software or service activation. No product or component can be absolutely secure. Your costs and results may vary. Intel does not control or audit third-party data.
You should consult other sources to evaluate accuracy. © Intel Corporation. Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.
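The key to CORaiL’s bandwidth savings, as described above, is that detection and classification happen at the camera, so only compact summaries travel to shore. As a rough illustration of that idea (the species names, confidence threshold, and message format below are invented for the example, not part of the deployed system), a per-interval summary might be built like this:

```python
# Sketch of edge-side "compress before transmitting": per-frame detections are
# reduced to a compact per-interval summary before being relayed to shore.
# Species names, threshold, and message layout are illustrative assumptions.
from collections import Counter
from datetime import datetime, timezone

def summarize_interval(frame_detections, min_confidence=0.5):
    """frame_detections: one list per frame of (label, confidence) pairs."""
    counts = Counter()
    for frame in frame_detections:
        for label, confidence in frame:
            if confidence >= min_confidence:
                counts[label] += 1
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "frames": len(frame_detections),
        "counts": dict(counts),   # e.g. {"damselfish": 4, "wrasse": 1}
    }

# A few frames' worth of hypothetical detector output
sample = [
    [("damselfish", 0.91), ("wrasse", 0.62)],
    [("damselfish", 0.88)],
    [("damselfish", 0.95), ("damselfish", 0.71), ("wrasse", 0.33)],
]
print(summarize_interval(sample))
```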
SUB2r customizable video camera gets its video pipeline programmability and configurability from an FPGA powered by four Intel® Enpirion® Power Systems on Chip (PowerSoC) modules

SUB2r, a self-funded startup, makes a video camera for storytellers and gamers who want to create compelling video content. The company had a problem: visible current noise was marring the images coming from its eponymous video camera. The SUB2r camera is really a video computing platform with a configurable, customizable, upgradeable, programmable, open-architecture imaging pipeline implemented in an FPGA. The camera derives its operating power from the attached USB 3.0 cable, which limits the camera’s power supply input to 5 volts at 3 amps. The camera’s original, internal voltage down-converter design was injecting a significant amount of noise into the circuitry, which resulted in a noisy image.

The SUB2r customizable video camera gets its video pipeline programmability and configurability from an FPGA that’s powered by four Intel® Enpirion® Power Systems on Chip (PowerSoC) modules.

SUB2r called in an expert power-conversion team from Intel for help. The team helped SUB2r redesign the camera’s on-board voltage regulation using four Intel® Enpirion® Power System on a Chip (PowerSoC) devices: the EN5319QI, EN5329QI, EN5339QI, and EN6340QI PowerSoCs. These four Intel Enpirion PowerSoC modules generate the four on-board power supplies required by the FPGA:

- 1 volt at 3 amps
- 5 volts at 3 amps
- 8 volts at 1 amp
- 5 volts at 2 amps

After the power supply was redesigned using the Intel Enpirion PowerSoCs, visible current noise in the camera’s video output stream became undetectable. As a bonus, current consumption drawn over the USB 3.0 power/data cable dropped from 3 amps to 1.6 amps, and the passively cooled camera’s internal operating temperature dropped from 58°C to 41°C due to the improved power supply efficiency. The reduced internal operating temperature should improve camera reliability, as it would in any electronic system – like yours.

The design team at SUB2r was so impressed by the overall result that they made a short “thank you” video for the Intel Enpirion team that helped with the redesign. You’ll find that video here. If you are facing tough power-conversion challenges, think about the results that SUB2r achieved and then consider giving the Intel Enpirion team a call. They’re here to help.

Legal Notices and Disclaimers:
Intel technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Performance varies depending on system configuration. No product or component can be absolutely secure. Check with your system manufacturer or retailer or learn more at intel.com. Results have been estimated or simulated using internal Intel analysis, architecture simulation and modeling, and provided to you for informational purposes. Any differences in your system hardware, software or configuration may affect your actual performance. Intel does not control or audit third-party data. You should review this content, consult other sources, and confirm whether referenced data are accurate. Cost reduction scenarios described are intended as examples of how a given Intel-based product, in the specified circumstances and configurations, may affect future costs and provide cost savings. Circumstances will vary. Intel does not guarantee any costs or cost reduction. © Intel Corporation. Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Altera is a trademark of Intel Corporation or its subsidiaries. Cyclone is a trademark of Intel Corporation or its subsidiaries.
Intel and Enpirion are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.
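A quick back-of-envelope check puts the current figures quoted above in perspective: with a fixed 5-volt USB 3.0 supply, the drop from 3 amps to 1.6 amps corresponds to roughly a 47% reduction in input power (assuming the bus voltage stays at 5 volts and ignoring cable losses).

```python
# Back-of-envelope input-power check for the figures quoted above: a 5 V USB 3.0
# supply with current draw falling from 3 A to 1.6 A. Assumes a constant 5 V bus.
BUS_VOLTAGE = 5.0        # volts (USB 3.0 supply cited in the post)
CURRENT_BEFORE = 3.0     # amps, before the Enpirion redesign
CURRENT_AFTER = 1.6      # amps, after the redesign

power_before = BUS_VOLTAGE * CURRENT_BEFORE   # P = V * I -> 15.0 W
power_after = BUS_VOLTAGE * CURRENT_AFTER     # -> 8.0 W
reduction = 1 - power_after / power_before

print(f"before: {power_before:.1f} W, after: {power_after:.1f} W, saved: {reduction:.0%}")
```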
See the explosive intersection of Intel® FPGAs, Intel® CPUs, and ProAV digital video in action at ISE 2020 next month in Amsterdam

Now that ProAV has gone completely digital and must deal with high-quality, high-bandwidth, low-latency, uncompressed and compressed video workloads, Intel’s corporate, data-centric mantra clearly applies: move data faster, efficiently store and access data, and process everything. You’ll be able to see this mantra in action in the Intel booth (#8-C210) at next month’s Integrated Systems Europe (ISE) 2020 show in Amsterdam. ISE is the world’s largest AV and systems integration tradeshow, with 1300 exhibitors, many thousands of products on display, and 80,000 attendees from 190 countries. The Intel booth will contain several significant demos, including:

- A real-time, video-based, traffic-tracking demo by Corerain that collects statistics on the people observed in the Intel booth at the show. The demo determines the number of people in the booth and the average time spent per person in each booth zone. There are obvious applications for this technology in both the retail environment and for security applications. This demo showcases the Corerain CAISA AI inference engine (for more information, see “Corerain’s CAISA stream engine transforms FPGA into Deep Learning Neural Network without HDL coding”) running on an IEI TANK-AIoT Dev Kit, which includes an IEI fanless industrial PC based on an Intel® Core™ i5 processor and a pre-installed copy of the Intel® Distribution of OpenVINO™ toolkit, developed specifically to aid in the development of vision-based solutions and compatible with a range of Intel® CPUs, Intel® GPUs, Intel® FPGAs, and the Intel® Movidius™ Neural Compute Stick. An Inspur PAC A10 programmable accelerator card – which is based on an Intel® Arria® 10 FPGA – is plugged into one of the PC’s PCIe slots. It provides a real-time acceleration platform for the Corerain CAISA AI inference engine.

- A highly responsive, large-format, 46-inch interactive flat panel display (IFPD) based on SigmaSense® technology, which provides superior touch performance compared to existing solutions. This display demonstrates full concurrency for self-capacitive and mutual-capacitive sensing. The company’s SigmaDrive™ concurrent drive and sense technology provides ultra-low latency and achieves industry-leading signal-to-noise ratio (SNR) by implementing real-time computational functions with five Intel® Cyclone® V SoCs. The demo’s IFPD communicates with a host PC based on an Intel® Core™ processor. The PC runs a variety of touch-enabled applications that integrate SigmaVision™ capacitive imaging functionality. The interactive SigmaSense touch technology accommodates screen sizes up to and beyond 100 inches (diagonally) and reduces the time needed for sensor tuning – an often tedious task that can require weeks of engineering work – to minutes. This touch technology works through water, gloves, or thick glass and is well suited for interactive tabletops and outdoor and retail interactive digital signage.

- An Ibase SP-63E 8K/12K Digital Signage Player that uses an Intel Core processor and three Intel Arria 10 FPGAs to display video from a media player based on an Intel® NUC Mini PC on a large mosaic of HD displays. Together, the Intel Core processor and the three Intel Arria 10 FPGAs process the video, scale it, slice it, and distribute it in real time to a large video wall constructed with twelve 1080p60 HDMI display panels. Large video walls are well suited to retail, enterprise, and educational environments and anywhere else that needs large, attention-getting digital signage.
ISE 2020 takes place in Amsterdam on February 11-14.

Legal Notices and Disclaimers:
Intel technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Performance varies depending on system configuration. No product or component can be absolutely secure. Check with your system manufacturer or retailer or learn more at intel.com. Results have been estimated or simulated using internal Intel analysis, architecture simulation and modeling, and provided to you for informational purposes. Any differences in your system hardware, software or configuration may affect your actual performance. Intel does not control or audit third-party data. You should review this content, consult other sources, and confirm whether referenced data are accurate. Cost reduction scenarios described are intended as examples of how a given Intel-based product, in the specified circumstances and configurations, may affect future costs and provide cost savings. Circumstances will vary. Intel does not guarantee any costs or cost reduction. © Intel Corporation. Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Altera is a trademark of Intel Corporation or its subsidiaries. Cyclone is a trademark of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.
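To put the video wall demo’s real-time slicing job in rough perspective, the aggregate output that the Intel Core processor and three Intel Arria 10 FPGAs must feed to twelve 1080p60 panels works out to roughly 1.5 gigapixels per second, or on the order of 36 Gbit/s of raw video if 24-bit color is assumed (the color depth and any link overhead are assumptions here):

```python
# Rough aggregate-throughput estimate for the 12-panel video wall described
# above: twelve 1080p60 outputs. The 24 bits-per-pixel figure is an assumption.
PANELS = 12
WIDTH, HEIGHT, FPS = 1920, 1080, 60
BITS_PER_PIXEL = 24

pixels_per_second = PANELS * WIDTH * HEIGHT * FPS
bits_per_second = pixels_per_second * BITS_PER_PIXEL

print(f"{pixels_per_second / 1e9:.2f} Gpixel/s")        # ~1.49 Gpixel/s
print(f"{bits_per_second / 1e9:.1f} Gbit/s raw video")  # ~35.8 Gbit/s
```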