Generative AI Archives - AiThority
https://aithority.com/tag/generative-ai/

Lenovo Unleashes AI-Powered Creativity and Productivity Devices and Solutions at CES 2024
https://aithority.com/machine-learning/lenovo-unleashes-ai-powered-creativity-and-productivity-devices-and-solutions-at-ces-2024/ | Tue, 09 Jan 2024 09:46:45 +0000

Lenovo strengthens its portfolio and vision of AI for All with AI PCs and other powerful innovations

Lenovo unveiled a full lineup of more than 40 new devices and solutions powered by AI, furthering the company’s vision of AI for All. The announcements include new AI PC innovations across Lenovo’s Yoga, ThinkBook, ThinkPad, ThinkCentre, and Legion sub-brands that personalize the computing experience for both consumers and businesses like never before. Two new proof-of-concept products, a tablet, a software app, Motorola AI features, accessories, and more round out the robust new portfolio of technology solutions.

Lenovo Showcases New Repertoire of Consumer Devices that Supercharge the Creative Process with AI

Lenovo showcased its latest repertoire of new consumer devices: a selection of Yoga AI laptops that supercharge the creative process, a tablet that invites users to play and learn, IdeaPad laptops designed for the everyday user, and peripherals for the modern world. Ushering in a new era, Lenovo’s newest lineup of Microsoft Windows 11 Lenovo Yoga laptops comes with Lenovo Yoga Creator Zone, exclusive new software for creators, artists, and anybody looking to harness the power of generative AI in a simple, private, and security-minded way. As an imagination sparkplug, Lenovo Yoga Creator Zone offers image generation that transforms text-based descriptions or even sketches into stunning visuals without complex prompts, code, or setup. Users simply type whatever they want to see, and the system instantly creates a visual representation.

Headlining the new generation of Lenovo Yoga laptops are the Lenovo Yoga Pro 9i (16”, 9) and the Lenovo Yoga 9i 2-in-1 (14”, 9), the latter of which comes bundled with a Lenovo Smart Pen and Sleeve. Designed for creators with uncompromising standards, both premium laptops are MIL-STD-810H rated for durability and are packed with cutting-edge components, including the latest Intel Core Ultra Processors for blazing-fast performance; a physical Lenovo AI Core Chip that powers robust AI functionality; and powerful batteries for nonstop creativity. Maximizing the performance of the Lenovo AI Core and Intel Ultra processors in the Lenovo Yoga Pro 9i (16”, 9) is Lenovo X Power, a tuning solution powered by machine learning that enhances creative tasks such as 3D rendering and film color correction by allocating processing power to optimize performance, battery life and cooling efficiency. Both models feature a Copilot key, providing faster access to one’s everyday AI companion. Copilot in Windows 11 harnesses the capabilities of AI to deliver relevant answers, summarize emails, generate images, and more.

Rounding out the Lenovo Yoga family are four additional new laptops. For portability, the Lenovo Yoga Slim 7i (14”, 9) is a thin-and-light Intel Evo edition premium laptop powered by Intel Core Ultra Processors and featuring a WUXGA OLED screen. For content creation, the Lenovo Yoga Pro 7i (14”, 9) and Lenovo Yoga Pro 7 (14”, 9) are creator laptops equipped with a choice of Intel Core Ultra Processors or up to an AMD Ryzen™ 7 8845HS processor, up to an NVIDIA GeForce RTX 4050 Laptop GPU with NVIDIA Studio validation, and a PureSight Pro LCD or OLED 3K screen.

For the wow factor, the Lenovo Yoga Book 9i (13″, 9), whose predecessor was the world’s first full-size dual-screen OLED laptop, can now be configured with Intel’s new Core Ultra Processors, a PureSight OLED 2.8K screen, and a rotating Bowers & Wilkins soundbar. It can also be equipped with a variety of creativity-enhancing software, including Smart Launcher to group commonly used apps for greater efficiency; Enhanced Virtual Keyboard skins that let users express their personality; AI-powered handwriting beautification; and more. Last but not least are the new Lenovo Yoga 7i 2-in-1 (16”, 9) and Lenovo Yoga 7i 2-in-1 (14”, 9) convertible laptops, which give creators on the go quick access to tools that complement their creativity.

Made for entertainment but designed for learning, the Lenovo Tab M11 tablet is the go-to “me time” device for students, movie connoisseurs, and doodling artists. Equipped with the Lenovo Tab Pen, a handy stylus that delivers an uncompromised writing, drawing and scribbling experience, the tablet comes preloaded with premium software that enhances usability: Nebo to convert handwriting into text, MyScript Calculator 2 to solve equations and functions in real-time, and WPS Office to easily view and edit documents.

All New AI PC Lenovo ThinkBook Laptops and ThinkCentre neo Desktops Inspire a New Wave of Productive and Creative Power

Lenovo also announced new ThinkBook products, ThinkCentre desktops, and accessories for the small and medium-sized business (SMB) market, with innovative features, smart designs, and AI PC enhancements. The new ThinkBook Plus Gen 5 Hybrid is a flexible solution with a laptop base system and a tablet that can work independently or together, seamlessly switching between laptop and tablet for a unique hybrid experience across Windows and Android. The ThinkBook 13x Gen 4 is a beautiful and powerful Intel Evo Edition laptop with fantastic battery life, a built-in Copilot key, and the distinction of being Lenovo’s first carbon-neutral laptop for SMBs. The ThinkBook 14 i Gen 6+ is a powerful, versatile, and smart laptop with up to a stunning 14.5-inch 3K display and a TGX port that supports the new ThinkBook Graphics Extension (TGX) dock for boosting AI computing power. Lenovo is also introducing its updated ThinkBook 16p Gen 5, combining power and elegance with support for the new Magic Bay Studio, which incorporates a 4K camera and integrated speakers. The ThinkCentre neo Ultra exploits the very latest technology to deliver a new generation of ultra-small form factor AI PC, and the ThinkCentre neo 50a Gen 5 all-in-one desktop will be available in 24- and 27-inch form factors. The Lenovo ThinkBook laptops, ThinkCentre neo desktops, and the Magic Bay Studio are the latest products and accessories showcasing Lenovo’s innovation and leadership in the SMB market.

Earlier at Lenovo’s Tech World 2023 innovation event, Lenovo showcased an example of a personal AI assistant solution for AI PCs as part of its AI for All vision. Under a Lenovo AI Now solution umbrella, Lenovo is continuing development of a personalized AI solution designed to enable end user interaction on the keyboard and through natural language. The provisionally named Lenovo AI Now Personal Assistant delivers personalized interactive experiences based on a user’s own on-device knowledge base. The AI Assistant will streamline workflows and enhance collaboration in a more personal and immersive manner. Using natural language, users will be able to check and change common settings such as display or performance, search and summarize emails and documents, create meeting invitations, and merge live camera and avatars during video conferencing. Lenovo AI Now Personal Assistant will begin rollout in the first half of 2024 in China.

In addition, Lenovo is showcasing two unique proof of concept devices at CES. The Mechanical Energy Harvesting Combo is a product that uses mechanical movement and solar irradiation to power a mouse and a keyboard, eliminating the need for external charging. The mouse and the keyboard are ergonomically designed to provide comfort and engagement for the user. The product also supports both Bluetooth® and 2.4G wireless connection modes, ensuring easy connectivity with multiple devices. The proof-of-concept product is an innovative solution that aligns with Lenovo’s commitment to implementing more sustainable practices.

Lenovo ThinkBook 13x Gen 4 SPE is a revolutionary and innovative proof of concept that boasts powerful performance and an exquisite appearance, and leads a new trend in intelligent, color-personalized laptop covers. Through Lenovo’s hardware and software algorithm solutions, and leveraging E Ink Prism™ technology, users can customize the exterior cover with various patterns, creating a unique notebook appearance. The concept supports up to one thousand different images, allowing users to express their personality and creativity. With ultra-low-power technology from E Ink and Lenovo’s system design, the color-changing top cover won’t impact battery life; even when the system is powered off, the top cover can still change. Lenovo will highlight four design schemes at CES 2024, including two colorful schemes, a dynamic clock, and multi-system interaction, catering to different customer preferences.

Lenovo Unlocks New AI PC Experiences with ThinkPad and IdeaPad Laptops Powered by Intel Core Ultra Processors

Lenovo unveiled new business and consumer laptops designed to unlock new AI experiences and boost productivity, creativity and efficiency. The new Lenovo ThinkPad X1 Carbon, ThinkPad X1 2-in-1, and IdeaPad Pro 5i are Intel Evo laptops powered by the latest Intel Core Ultra processors and Windows 11 that deliver optimal power efficiency, performance, and immersive experiences. Dedicated AI acceleration support will help users embrace new experiences and enhance efficiency in work and play, including capabilities enabled by Copilot in Windows. Whether for business or leisure, these Lenovo laptops are amongst the first that are driving an AI PC revolution that will fundamentally change how people create, collaborate, and interact with PCs. Designed to offer users the most comprehensive PC experiences yet, the new ThinkPad X1 and IdeaPad Pro 5i will help users embrace a new generation of AI computing.

Just like the next wave of business laptops, the Lenovo ThinkVision 27 3D monitor is available now and ready to boost productivity and efficiency. The glasses-free 3D monitor now features an even more intuitive and interactive version of the 3D Explorer user interface, which welcomes creators to the 3D realm and can also be used in 2D. Additionally, the monitor now comes with increased software support through proprietary applications, including Design Engine, which eliminates the need for individual plug-ins to provide a true interdimensional hybrid design experience. Users can now design in 2D and visualize in 3D, or use its 2D-to-3D Converter, enabling AI-powered 2D-to-3D image, video, and content conversion in real time. With AI, high-resolution, high-refresh-rate 2D content instantly transforms into vivid 3D content with precise spatial reconstruction, regardless of how complex the background is, and all without requiring additional power or system upgrades.

The Lenovo Legion Gaming Ecosystem: Helping Gamers Reach Their Impossible

Lenovo’s new gaming ecosystem debuted at CES 2024 with Microsoft Windows 11-based PCs, peripherals, software, and services that deliver on power, thermals, graphics, AI-powered advantages, and the freedom to build the ultimate system to game. The new PC portfolio includes the following new 9th generation Lenovo Legion 16-inch gaming laptops and towers:

  • The Lenovo Legion 7i (16”, 9), Lenovo Legion 5i (16”, 9), and Lenovo Legion Slim 5 (16”, 9) are for gamers who need a laptop that can handle their favorite games as well as their STEM apps.
  • The Lenovo Legion 9i (16”, 9) is for those who will settle for nothing less than the best in their gaming and content creation pursuits.
  • The Lenovo Legion Pro 7i (16”, 9) and Lenovo Legion Pro 5i (16”, 9) are for gamers who need the ultimate in FPS, style, and screen.
  • Lenovo Legion Tower 7i and Legion Tower 5i are for those who need the extra horsepower of a top-tier gaming tower PC.

Also new this year are the Lenovo LOQ 15IRX9, Lenovo LOQ 15IAX9I, Lenovo LOQ 15IAX9, and Lenovo LOQ 15AHP9 laptops and Lenovo LOQ Tower 17IRR9 for gamers beginning their journey up the leaderboards. New accessories, such as the Lenovo Legion M410 Wireless RGB Gaming Mouse and Legion K510 Pro Mini Keyboard, along with a new Lenovo AvatarMaster PC software app round out the upgraded ecosystem.

At the core of this new gaming lineup is the family of Lenovo’s proprietary hardware AI chips, called LA AI chips, and the advantages they bring to both Lenovo Legion and Lenovo LOQ gaming laptops. First introduced at last year’s CES, this year’s LA AI chips are mightier than ever, enabling Lenovo Legion and Lenovo LOQ laptops to achieve even higher FPS, increased power efficiency, and more. And with a selection of gaming laptops, towers, monitors, accessories, and even the handheld Lenovo Legion Go announced at last year’s IFA, the new Lenovo gaming ecosystem lets gamers choose the exact setup they need to achieve gaming greatness and reach their ‘impossible’.

Taking the streaming and collaboration experience to a new level, select Lenovo Legion systems, including the Legion 7i (16”, 9) and Legion 5i (16”, 9) are now equipped with AvatarMaster. A new app powered by AI, AvatarMaster transforms users’ profiles into a 3D digital avatar, with complete customization capabilities from appearance and facial features to clothing and accessories. After creating and customizing their avatars, users can animate and stream a digital version of themselves during video conferences, gaming sessions, and across multiple platforms.

Motorola Introduces new MotoTalk Features with AI to Increase Productivity of Retail Teams

Motorola has introduced new AI features available on MotoTalk, a business productivity platform for PC and mobile devices that allows business customers to create and manage tasks and workdays for their teams. With the new features, field teams can use Image Recognition or Route Planning to optimize their daily activities.

With the ‘Image Recognition’ feature, field teams can identify and count the quantity of any product on shelves, gather information on prices, and access share reports in each store, ensuring merchandising compliance. Leveraging AI, the tool also strategically optimizes commercial routes, maximizing efficiency in terms of distance, time, and sales opportunities. This feature enables the generation of itineraries for different field teams in a few minutes, and it also summarizes each employee’s visits and performance in stores, saving workload for MotoTalk users.

The Microsoft AI Odyssey Will Prepare 100,000 Developers in India for the Future
https://aithority.com/machine-learning/the-microsoft-ai-odyssey-will-prepare-100000-developers-in-india-for-the-future/ | Tue, 09 Jan 2024 08:19:32 +0000

The month-long program has two levels that developers need to complete for a chance to win VIP passes to Microsoft’s upcoming AI Tour

Microsoft announced the launch of an initiative that aims to skill 100,000 developers in India in the latest AI technologies and tools. The program offers a comprehensive learning experience that helps developers acquire and demonstrate the relevant skills needed to execute critical projects using AI technologies that align with business goals and outcomes. With AI revolutionizing the way we work and live, India needs a skilled workforce that can use AI to solve complex problems and create value. Through programs like AI Odyssey, Microsoft is offering opportunities for developers to build solutions for India’s growth and showcase their talent in solving real-world problems.

To participate in the AI Odyssey, developers need to register on aka.ms/AIOdyssey and access the learning modules and resources. The month-long program is open to all AI enthusiasts in India, regardless of experience level or background, and has two levels that participants need to complete by January 31, 2024. The first level educates participants on how to use Azure AI services to create and deploy AI solutions for different scenarios, giving them access to useful resources, code samples, and guides to master practical AI skills.

The second level of the program challenges participants to prove their AI skills by completing an online assessment with interactive lab tasks, earning them the Microsoft Applied Skills credential, a new verifiable credential that validates their ability to solve real-world problems with AI. Participants who complete both levels will also stand a chance to win a VIP Pass to attend the Microsoft AI Tour in Bangalore on February 8, 2024, and witness the revolutionary impact of generative AI. The AI Tour is a one-of-a-kind event that showcases how generative AI can enable new forms of creativity, collaboration, and problem-solving. The tour features keynote sessions, demos, and workshops where developers can learn from Microsoft experts and partners, as well as network with peers.

Irina Ghose, managing director, Microsoft India, said, “AI is the future of innovation and India is leading the way with its tech talent. The Microsoft Applied Skills credential will help developers demonstrate their competence and creativity in the most in-demand AI skills and scenarios. We welcome all developers to join us in creating meaningful AI solutions that will contribute to India’s economy.”

SurveyMonkey Announces New Trends in Business, Culture, and Social
https://aithority.com/natural-language/chatgpt/surveymonkey-announces-new-trends-in-business-culture-and-social/ | Mon, 08 Jan 2024 18:11:36 +0000

Annual study leverages insights from 17 million active users asking 20 million questions on the SurveyMonkey platform every day

SurveyMonkey, a global leader in online surveys and forms, announced the findings of its second annual State of Surveys report. Analyzing 11 years’ worth of anonymized and aggregated SurveyMonkey data for trends and anomalies, researchers identified the evolving ways in which people ask and answer questions in the modern era. From this analysis, a handful of themes emerged to define the current state of surveys: the rise of AI, gender inclusivity around the world, our increasingly mobile world, employee engagement, and survey best practices.

Key themes include:

People are curious about AI

After the introduction of ChatGPT at the end of 2022, terms like “AI,” “artificial intelligence,” “ChatGPT,” and “generative AI” all saw increased use in surveys in 2023.

  • Mentions of these AI-related terms grew 4x, rising from just 0.5% of surveys deployed on the SurveyMonkey platform in 2022 to 3% by September 2023, indicating growing use of consumer-facing AI language.
  • Specifically, the term “AI” was used 10x more than in previous years, highlighting the explosion of general awareness around the technology.

Gender-inclusive surveys are growing worldwide

As gender inclusivity has expanded, the number of gender answer options used in surveys around the world has grown as well, with a handful of countries setting the pace for the rest of the world.

  • Within the U.K. and Canada in 2023, 8 out of 10 surveys sent included more than two gender answer options. In 2012, this number was closer to 2 out of 10.
  • Canada (84%), the U.K. (82%), and Australia (80%) led the way with the highest percentage of surveys offering nonbinary, gender-inclusive answer options in 2023.

More people are taking surveys on mobile devices, and employee and customer feedback remain in demand

It’s clear people want to fill out surveys and forms on their phones, increasingly favoring a mobile-first experience.

  • Mobile survey-taking around the world has grown from 52% in 2020 to 57.2% in 2023.
  • In the U.S., people are now as likely to respond to a survey on mobile as they are on desktop. At the end of 2020, 16% more people took surveys on desktop than mobile. By the end of 2023, that gap had shrunk to 0.6%.

With connection top of mind, businesses of all sizes continue to prioritize collecting customer feedback and employee engagement (particularly with remote workers).

  • The most-used survey templates for CX, HR, and marketing professionals in 2023 featured customer satisfaction, employee engagement, meeting feedback, Net Promoter Score (NPS), and name testing.
  • Event registration and RSVP forms top the list of the most used forms, especially as in-person events increase.

Survey best practices emerge

  • Less is more! Surveys are getting shorter on average, with the number of questions per survey trending down by around 15% over the past six years—from 13.2 questions in 2017 to 11 questions in 2023.
  • Survey response submissions are higher during the week—with four in five survey submissions happening Monday through Friday. Survey completion rates are highest on Wednesday, Thursday, and Friday and lowest on Sunday and Monday.
  • Surveys taken on mobile devices are slightly more likely to be submitted on weekends compared to those taken on non-mobile devices (24% vs. 20%).
  • Template usage doubled in 2023. Surveys created from a template deliver completion rates 4 percentage points higher than surveys created from scratch.
  • Surveys that used the SurveyMonkey question bank, a library of thousands of commonly asked questions written by survey methodologists, saw completion rates 9 percentage points higher than surveys that did not use this resource.

“The knowledge our customers seek is a fascinating reflection of how the world is evolving,” said Lara Belonogoff, senior director of brand management at SurveyMonkey. “The true value lies in how survey creators put the insights gleaned from the data they collect into action for good: embracing gender inclusion, empowering businesses with AI, and engaging one another in ways that drive human connection. These are the tangible outcomes of a well-designed, expertly executed survey that drive meaningful decisions, raise the bar for human experiences, and inspire ongoing curiosity.”

Dexios Chooses Maverick Medical AI as Its AI Medical Coding Solution
https://aithority.com/machine-learning/dexios-chooses-maverick-medical-ai-as-its-ai-medical-coding-solution/ | Mon, 08 Jan 2024 18:05:23 +0000

Through this partnership, Dexios will leverage Maverick’s AI-powered Autonomous Medical Coding Platform to help scale its operations, improve consistency across its coding team, and reduce coding-related denials.

Maverick Medical AI, or Maverick, a provider of an innovative and autonomous AI-powered medical coding platform, partners with Dexios, a radiology-specific revenue-cycle management (RCM) company serving imaging centers and hospital-based radiologists throughout the U.S. Through this partnership, Dexios integrates Maverick’s AI-powered Autonomous Medical Coding Platform to augment its medical coding team, boost productivity, and achieve a higher direct-to-bill rate.

Healthcare providers industry-wide have faced non-clinical burdens, including a shortage of medical coders that has impacted the revenue cycle management (RCM) process, resulting in a loss of efficiency. A poor medical coding process will typically result in billing mistakes, which can cost healthcare providers millions in revenue each year.

Maverick combines extensive domain knowledge of medical coding with cutting-edge machine learning and generative AI. The AI engine autonomously analyzes clinical notes and reports, accurately generating reimbursement codes (ICD-10, CPT) in real-time and allowing for charges to be available to billing systems within seconds. Maverick’s Generative AI model, which learns from the collection of real-time feedback from auditors and medical coders, results in consistently compliant coding and industry-leading direct-to-bill rates.
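
To picture the workflow described above, clinical note in, reimbursement codes and a billable charge out within seconds, here is a minimal sketch. The function names (extract_codes, route_charge), the confidence threshold, and the example codes are assumptions for illustration only; this is not Maverick’s actual API or model.

```python
# Hypothetical sketch of an autonomous medical-coding flow (illustrative, not Maverick's API).
from dataclasses import dataclass

@dataclass
class CodedCharge:
    icd10: list[str]   # diagnosis codes, e.g. ["M54.50"] (low back pain, unspecified)
    cpt: list[str]     # procedure codes, e.g. ["72148"] (MRI lumbar spine without contrast)
    confidence: float  # model confidence, used to decide direct-to-bill vs. human review

def extract_codes(clinical_note: str) -> CodedCharge:
    """Stand-in for the coding model: analyze the note and return reimbursement codes."""
    return CodedCharge(icd10=["M54.50"], cpt=["72148"], confidence=0.97)  # placeholder output

def route_charge(note: str, direct_to_bill_threshold: float = 0.95) -> str:
    charge = extract_codes(note)
    if charge.confidence >= direct_to_bill_threshold:
        return "sent directly to billing"      # the "direct-to-bill" path
    return "queued for human coder review"     # reviewer feedback is what keeps the model compliant

print(route_charge("MRI lumbar spine without contrast; chronic low back pain."))
```

The key quantity in such a pipeline is the fraction of charges that clear the threshold without human touch, which is what the direct-to-bill rate cited below measures.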

Maverick’s platform has already been integrated into Dexios’s system, which serves several radiology clinics, imaging centers, and hospitals across 16 states. As an RCM service provider, Dexios leverages Maverick’s Autonomous Medical Coding platform to support its entire operation by underpinning the work of its medical coders—achieving an industry-vaulting 85 percent direct-to-bill rate.

“We are proud to partner with a leading RCM that caters to the radiology market,” says Yossi Shahak, CEO and Co-Founder of Maverick Medical AI. “Our decision to embrace Generative AI in our platform’s development was a game-changer. It’s the ideal solution for medical coding, with proven results. We’ve achieved an impressive 85% direct-to-bill rate, revolutionizing the medical coding industry and significantly alleviating the administrative burdens on healthcare providers.”

Kyle Tucker, CEO of Dexios and President-elect of HBMA, stated, “AI will fundamentally change the way that the RCM industry functions. It will be as dramatic as when the industry went from being mostly manual and paper-driven to computers. Unlike then, however, it won’t take a decade to evolve and mature. We should be thinking in terms of months and not years this time. The AI revolution is already here, and the difference between the AI-driven RCM companies and the non-AI-driven companies will be as stark as those RCM companies still doing things manually when their competitors were computerized. We partnered with Maverick because they were on the leading edge of AI technology and had invested in doing things the right way. Dexios’ focus has always been on coding accuracy. We wanted to partner with someone who had the same passion for detail and accuracy that we did. We found that in Maverick.”

“Maverick Medical AI has supercharged our ability to code accurately and quickly for our clients, allowing our team to strategize on how to reduce denials on the front end, which will increase Net Collection Percentage for our clients. I proceed cautiously with implementing new technology and had concerns based on past experience with computer-assisted coding, but Maverick Medical AI has exceeded my expectations,” says Billie Artcliff, President of Dexios.

Stacked Infrastructure Expands its AI-Ready Data Center Capabilities to Support Machine Learning
https://aithority.com/natural-language/chatgpt/stacked-infrastructure-expands-its-ai-ready-data-center-capabilities-to-support-machine-learning/ | Mon, 08 Jan 2024 15:14:43 +0000

STACK’s proven design supports the next generation of data centers and their artificial intelligence workload requirements

STACK Infrastructure (“STACK”), the digital infrastructure partner to the world’s most innovative companies and a leading global developer and operator of data centers, positions itself as the premier developer for the future era of data centers, leveraging its established AI-Ready capabilities built upon STACK’s proven, customizable design. With a history of supporting high-density workloads, STACK reinforces its position as a pioneering partner to the world’s foremost technology providers.

In the rapidly evolving landscape of technology, characterized by generational advancements in AI, the demand for data center capacity capable of accommodating high-density workloads is reaching unprecedented levels. A prime example is OpenAI’s ChatGPT, the generative AI chatbot that launched just over a year ago and, within two months, became the fastest-growing consumer application in history. With such remarkable growth of AI and other machine learning applications, STACK’s cutting-edge data center solutions and its extensive portfolio of powered land holdings have become even more crucial to today’s technology companies. Purposefully crafted to meet the escalating high-density requirements of clients, STACK’s resources are positioned at the forefront of innovation.

“STACK was built for the world’s largest innovators, and from the beginning, we have prioritized the long-term scalability of our clients with a flexible data center design,” said Matt VanderZanden, Chief Operating Officer of STACK Americas. “As an AI-Ready digital infrastructure company with a proven track record of high-density deployments, STACK is building upon this foundation to continually address evolving client needs.”

STACK achieves optimal cooling for high-density AI workloads through its closed-loop water cooling systems, offering flexibility to meet diverse cooling requirements including customizable solutions that can support up to 30kW per rack with traditional air cooling; up to 50kW per rack with rear door heat exchangers; up to 100kW per rack with direct-to-chip liquid cooling; and up to and exceeding 300+kW per rack with immersion cooling in the future. STACK’s extensive portfolio of operating data centers, all designed based on cost-efficient and proven models, exemplifies its ability to swiftly deliver gigawatts of scale, coupled with expertise in cooling systems enabling greater and greater deployment densities, and affirms STACK as the ideal data center developer for sustained growth in the age of AI.
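
The density tiers above map naturally onto a simple selection rule. The sketch below is purely illustrative, using the per-rack figures quoted in the announcement as cut-offs; the thresholds and the function itself are not part of any STACK specification.

```python
def cooling_option_for(rack_kw: float) -> str:
    """Pick a cooling approach for a target per-rack density (thresholds taken from the text above)."""
    if rack_kw <= 30:
        return "traditional air cooling"
    if rack_kw <= 50:
        return "rear door heat exchangers"
    if rack_kw <= 100:
        return "direct-to-chip liquid cooling"
    return "immersion cooling (300+ kW per rack, future)"

for density in (25, 45, 90, 300):
    print(f"{density} kW/rack -> {cooling_option_for(density)}")
```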

STACK’s global footprint of campuses spans major data center markets across the Americas, EMEA, and APAC, offering a vast powered land portfolio, sustainable building solutions, and high-density cooling optionality for the next generation of AI-Ready data centers. With 2.5+GW built or under development and an additional 4.0+GW of planned and potential development, STACK strategically supports the hyperscale requirements and groundbreaking initiatives of leading global technology firms through AI-Ready campus developments including:

  • A planned five-building data center campus offering 250MW of scale in Central Phoenix with a dedicated on-site substation.
  • A 48MW Santa Clara data center, featuring immediately available shell space powered by an onsite substation with rare contracted capacity.
  • A 56MW Toronto campus, spanning 19 acres, includes an existing 8MW data center and 48MW expansion capacity, all supported by committed power.
  • A 48MW build-to-suit opportunity in the Dallas/Fort-Worth area, boasting abundant power and connectivity options.
  • A 200MW campus in Portland spanning 55 acres with 96MW currently available for leasing.
  • A New Albany, Ohio 58MW data center campus with immediately available capacity and build-to-suit expansion opportunities.
  • A strategically located data center campus in Osaka, Japan with 72MW of capacity across three planned buildings.
  • A 36MW facility delivered in Australia, launching a 72MW campus in Melbourne, one of the fastest growing markets in Asia Pacific.
  • A 30MW data center campus in Stockholm with 18MW under development.

DEEPX CEO Lokwon Kim to Speak at CES 2024 Panel on the Future of AI Hardware and Chips
https://aithority.com/machine-learning/deepx-ceo-lokwon-kim-to-speak-at-ces-2024-panel-on-the-future-of-ai-hardware-and-chips/ | Mon, 08 Jan 2024 15:03:32 +0000

DEEPX, a leading original AI chip company, is thrilled to announce its participation in the esteemed global IT and electronics exhibition, CES 2024, taking place in Las Vegas, USA, from January 9 to 12. DEEPX, under the guidance of its CEO, Lokwon Kim, will showcase the future of ultra-low power on-device AI solutions through innovative AI chips.

To discover DEEPX’s ultra-low power AI chip solutions, visit Booth #8953, North Hall, at CES 2024.

CES 2024 is poised to be a pivotal year for on-device AI, with industry giants such as Intel and Qualcomm set to deliver keynote speeches and engage in discussions on the transformative potential of AI. DEEPX is proud to contribute to this dialogue by unveiling its ‘All-in-4 AI Total Solution’ at CES, comprising four groundbreaking products complemented by a real-time technology demonstration that enables on-device AI.

Notably, DEEPX CEO Lokwon Kim has been invited to join global luminaries from the AI hardware and semiconductor sectors to discuss AI hardware, semiconductor technologies and market trends. The talk, “The Hard Part of AI: Hardware and Chips,” will take place on Wednesday, January 10, from 9:00 am to 9:40 am local time at the Las Vegas Convention Center, North Hall N250. The conversation will explore the global adoption of on-device machine vision and edge AI, market demand for hardware and on-device solutions to democratize generative AI, innovative AI chips and technology convergence, and industry challenges and opportunities.

A 2023 Gartner report underscores the growing significance of AI-driven computer vision and edge AI markets, highlighting their potential to shape the future. On-device AI in particular, as distinct from edge servers, requires AI capabilities to be implemented directly on the device, bypassing servers or the cloud. The market is expanding due to computer vision capabilities like facial recognition, voice recognition, and photo editing in various devices, including smart mobility and robotics, the Internet of Things, and physical security systems.

McKinsey predicts the global smart mobility market will hit $1.5 trillion by 2030. A Grand View Research report forecasts the global smart home appliance market will soar to $58.51 billion in 2030. Furthermore, the global camera module market is expected to reach $60.44 billion by the same year. Cameras and sensor-based modules such as radar, lidar, and ultrasound also require real-time AI computational processing. Thus, the demand for AI chips implementing computer vision is likely to be massive.

Nonetheless, the realization of large-scale AI edge devices has been challenging due to the limitations imposed by battery-powered, on-device environments, which are constrained in cooling capability and hardware resources. To overcome these hurdles, DEEPX has pioneered low-power, high-performance AI chip source technologies, including INT8 model compression and efficient memory utilization that minimizes DRAM and cache memory use, achieving unparalleled power-to-performance ratios.
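
As a rough picture of what INT8 model compression means in practice, the sketch below applies generic symmetric per-tensor quantization, mapping floating-point weights onto 8-bit integers plus a single scale. It illustrates the general technique only and is not DEEPX’s proprietary method.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor INT8 quantization: w ~= scale * q, with q in [-127, 127]."""
    scale = np.abs(weights).max() / 127.0                      # largest magnitude maps to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.randn(4, 8).astype(np.float32)
q, s = quantize_int8(w)
print("max abs error:", np.abs(w - dequantize(q, s)).max())   # small relative to the weight range
print("storage: 1 byte per weight instead of 4, plus one scale per tensor")
```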

A noteworthy development in on-device AI is the integration of Large Language Models (LLMs), facilitating functions such as chatbots, translation, summarization, writing, and coding. This shift is driven by the imperatives of data privacy, real-time processing, reduction in cost and power consumption, and the diversification of edge device applications. While the market for AI PCs and on-device hardware running generative AI models is nascent, lightweight and advanced generative AI models that cater to users’ evolving needs are poised to emerge soon.

CEO Lokwon Kim remarked, “This marks our inaugural participation in CES 2024, and we are delighted to unveil our independently developed technology through an exclusive booth. It is an honor to receive three CES Innovation Awards and be selected to participate in the CES Panel Talk event as a representative of an AI chip company. DEEPX is dedicated to democratizing AI, making it accessible to all corners of the globe. We have successfully delivered our first semiconductor prototypes to global customers and are preparing for mass production and widespread adoption. In the foreseeable future, DEEPX’s AI chips will usher in remarkable changes in our lives and society, driving innovations in smart mobility, robotics, autonomous vehicles, physical security systems, and factory automation.”

Expedera NPUs Run Large Language Models Natively on Edge Devices
https://aithority.com/machine-learning/expedera-npus-run-large-language-models-natively-on-edge-devices/ | Mon, 08 Jan 2024 14:49:07 +0000

Expedera NPU IP adds native support for LLMs, including stable diffusion

Expedera, Inc, a leading provider of customizable Neural Processing Unit (NPU) semiconductor intellectual property (IP), announced that its Origin NPUs now support generative AI on edge devices. Specifically designed to handle both classic AI and Generative AI workloads efficiently and cost-effectively, Origin NPUs offer native support for large language models (LLMs), including stable diffusion. In a recent performance study using the open-source foundational LLM, Llama-2 7B by Meta AI, Origin IP demonstrated performance and accuracy on par with cloud platforms while achieving the energy efficiency necessary for edge and battery-powered applications.

LLMs bring a new level of natural language processing and understanding capabilities, making them versatile tools for enhancing communication, automation, and data analysis tasks. They unlock new capabilities in chatbots, content generation, language translation, sentiment analysis, text summarization, question-answering systems, and personalized recommendations. Due to their large model size and the extensive processing required, most LLM-based applications have been confined to the cloud. However, many OEMs want to reduce reliance on costly, overburdened data centers by deploying LLMs at the edge. Additionally, running LLM-based applications on edge devices improves reliability, reduces latency, and provides a better user experience.

“Edge AI designs require a careful balance of performance, power consumption, area, and latency,” said Da Chuang, co-founder and CEO of Expedera. “Our architecture enables us to customize an NPU solution for a customer’s use cases, including native support for their specific neural network models such as LLMs. Because of this, Origin IP solutions are extremely power-efficient and almost always outperform competitive or in-house solutions.”

Expedera’s patented packet-based NPU architecture eliminates the memory sharing, security, and area penalty issues that conventional layer-based and tiled AI accelerator engines face. The architecture is scalable to meet performance needs from the smallest edge nodes to smartphones to automobiles. Origin NPUs deliver up to 128 TOPS per core with sustained utilization averaging 80%—compared to the 20-40% industry norm—avoiding dark silicon waste.
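
To make the utilization claim concrete: effective throughput is peak throughput multiplied by sustained utilization, so the gap between 80% and the quoted 20-40% industry norm compounds directly. A quick back-of-the-envelope check using only the figures quoted above:

```python
peak_tops = 128  # quoted per-core peak for Origin NPUs

def effective_tops(peak: float, utilization: float) -> float:
    """Effective throughput = peak throughput x sustained utilization."""
    return peak * utilization

print(effective_tops(peak_tops, 0.80))  # 102.4 effective TOPS at the claimed 80% utilization
print(effective_tops(peak_tops, 0.30))  # 38.4 effective TOPS at a 30% industry-typical rate
```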

Syntiant Achieves 50% Acceleration of Large Language Models for Edge Devices
https://aithority.com/machine-learning/syntiant-achieves-50-perce-acceleration-of-large-language-models-for-edge-devices/ | Mon, 08 Jan 2024 13:39:51 +0000

Syntiant Corp., a leader in edge AI deployment, announced results of its first optimizations that reduce the computational footprint of leading large language model (LLM) architectures, enabling these massive neural networks to run on cloud-free, always-on devices at the edge of networks.

“We are focused on making edge AI a reality, bringing low power, highly accurate intelligence to edge devices,” said Kurt Busch, CEO at Syntiant. “Our core model optimizations enable considerable processing time acceleration across a range of compute platforms. Those same optimizations, which Syntiant has deployed across millions of devices worldwide, have been successfully applied to LLMs, bringing conversational speech to the edge, serving as the new interface between humans and machines.”

Syntiant used a novel algorithm to determine the sparsity fraction of LLMs to generate significant speedups in output token generation and reduce memory footprint. The company achieved 50% sparsity with minimal accuracy loss when computing with 8-bit quantized weights, significantly improving interpretability and processing power while reducing cloud costs. These improvements, along with a custom SIMD (single instruction/multiple data) kernel and several other algorithmic innovations, enable Syntiant to achieve a 50% increase in output token generation speed on a LLaMa-7B benchmark.
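
A minimal sketch of the general idea, 50% magnitude sparsity applied to weights that are then quantized to 8 bits, is shown below. It illustrates the generic technique in NumPy; it is not Syntiant’s algorithm for choosing the sparsity fraction, nor its custom SIMD kernel.

```python
import numpy as np

def prune_to_sparsity(weights: np.ndarray, sparsity: float = 0.5) -> np.ndarray:
    """Zero out the smallest-magnitude weights until roughly `sparsity` of entries are zero."""
    k = int(weights.size * sparsity)
    threshold = np.partition(np.abs(weights).ravel(), k)[k]    # k-th smallest magnitude
    return np.where(np.abs(weights) < threshold, 0.0, weights)

w = np.random.randn(1024, 1024).astype(np.float32)
w_sparse = prune_to_sparsity(w, 0.5)

# Symmetric 8-bit quantization of the pruned weights (zeros stay exactly zero).
scale = np.abs(w_sparse).max() / 127.0
q = np.round(w_sparse / scale).astype(np.int8)

print("sparsity after pruning + quantization:", (q == 0).mean())  # ~0.5; zero entries can be skipped by a sparse kernel
```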

“Syntiant continues to lead the category of low power NLP with production-ready solutions that directly address voice AI workload efficiency and could enable smarter conversational endpoints with generative AI. At M12, Microsoft’s Venture Fund, we’re proud to back Syntiant who we feel will help establish a ‘first mile’ standard for future speech-to-speech AI use cases,” said Michael Stewart, partner at M12.

Demoed live at CES 2024 in Las Vegas, Syntiant’s optimizations demonstrate an adjacent class of LLMs that run entirely at the edge. The ability to enable significant processing time acceleration brings meaningful benefits to end users from both latency and privacy perspectives, across a wide variety of consumer and commercial use cases, from earbuds to automobiles.

Generative AI Inside Programmable NFTs Is New — and Can Accelerate the Sector Beyond Hype
https://aithority.com/technology/blockchain/nft/generative-ai-inside-programmable-nfts-is-new/ | Mon, 08 Jan 2024 13:30:09 +0000

In their infancy, NFTs have suffered mightily from both misconceptions about the technology behind them and manipulation by fraudsters and hackers. Issues have arisen because many people, including creators, rushed to treat them as monetized cryptocurrency projects, tradable assets, and get-rich-fast investments. The more thoughtful, functional opportunities possible with them have often been overshadowed by hype, speculation, speed, and greed.

That’s finally starting to change — and it isn’t because brands made them collectibles or high-end coupons. It’s because evolving technical capabilities are steadily improving NFT functionality.

It’s still true that an NFT refers to an asset, most often an image, tokenized with a unique digital signature recorded on a decentralized, immutable blockchain like the Ethereum network. This signature on a distributed blockchain ledger, across thousands of computers worldwide, ensures that the NFT’s existence and transfer can be verified with an extremely high level of certainty — without relying on a central authority like a bank or broker. That’s well and good, but the public has largely remained unable to see this in practice, translate what it means, or find relevant applications to their lives and work.

With the emergence of programmable NFTs in 2023, we can now directly embed content and executable programs inside an NFT, which creates a significant difference in how we experience them, their utility, and even the jobs they can perform for us. My team’s new Immutable Miniverse Format (IMF), for example, immutably encodes — and can encrypt — tailored messages, games, music, binaries, and interactive experiences within an NFT, visualized as a string of secret code in motion. Importantly, this self-contained “miniverse” is embedded as part of the NFT artwork — without the technology burden of maintaining Web3 access points when being used on traditional social media.
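
As a loose illustration of what embedding encrypted content in a token can look like, the sketch below encrypts a payload and places it alongside the artwork reference in an NFT’s metadata. This is a generic sketch under stated assumptions, not the Immutable Miniverse Format specification; the field names and the use of Fernet encryption are choices made purely for exposition.

```python
# Illustrative only: embedding an encrypted payload in NFT metadata (not the IMF spec).
import base64
import json
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()              # held by the NFT owner, never published on-chain
payload = b"<message, game, music, or interactive-experience bytes>"
ciphertext = Fernet(key).encrypt(payload)

metadata = {
    "name": "Example programmable NFT",
    "image": "ipfs://<artwork-cid>",                               # the visible artwork
    "embedded_miniverse": base64.b64encode(ciphertext).decode(),   # self-contained encrypted content
}
print(json.dumps(metadata, indent=2)[:160], "...")
```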

The Next Level with Generative AI

Fusing NFT art and programming code in a self-contained way is a significant development.

Programmable NFTs can be leveraged for work and play, shared on today’s social media platforms (Web2), and can even serve as the building blocks of a kind of “app store” for Web3 and a more decentralized economy, comparable to today’s mobile phone apps. Additionally, the use cases on everyone’s mind involve generative AI technologies.

For good reason: the ultimate NFT utility may be the capacity to tokenize intelligence in a decentralized and immutable fashion.

Here, we’re not talking about using generative AI to churn out NFT art, rather the focus is on integrating LLM chat capabilities into programmable NFTs and ultimately empowering owners to program and mint their own NFTs to do this. My team is enabling NFT holders to have a private conversation with and within their NFT about the domain and knowledge embedded inside their NFT’s miniverse. This means leveraging a customized, fine-tuned version of an LLM, while not subjecting any input to third-party services that could opaquely use the data or violate privacy — a criticism leveled at OpenAI and others in a typical centralized GPT setting.

In fact, if you’re maximizing the strengths of both technologies, NFTs and gen AI, then:

  • The embedded information on which an LLM conversation is based should be immutable, individualized, and decentralized.
  • The conversation with such embedded information must preserve privacy. It does not need to reference any public GPT services.
  • NFT creators must be committed to decentralized AI on Web3. Future iterations should and will allow a user to run their LLM model locally on the user’s own computer (a minimal sketch of this pattern follows the list).
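
A rough sketch of what such a private, on-device conversation could look like is below. Everything in it is an assumption for illustration: load_miniverse_text stands in for decrypting and reading the NFT’s embedded content, and local_llm stands in for any locally run, fine-tuned language model; no third-party API is called and nothing leaves the device.

```python
# Hypothetical sketch: private Q&A over an NFT's embedded content with a locally run LLM.
def load_miniverse_text(token_id: int, owner_key: bytes) -> str:
    """Assumed helper: decrypt and return the text embedded in the NFT's miniverse."""
    return "Grandfather emigrated in 1952; the family recipe book is in appendix B."  # stub content

def local_llm(prompt: str) -> str:
    """Assumed helper: run a locally hosted, fine-tuned LLM entirely on the user's machine."""
    return "(answer generated locally from the embedded content)"  # stub response

def ask_nft(token_id: int, owner_key: bytes, question: str) -> str:
    context = load_miniverse_text(token_id, owner_key)
    prompt = f"Answer using only this embedded content:\n{context}\n\nQuestion: {question}"
    return local_llm(prompt)

print(ask_nft(42, b"owner-key", "When did Grandfather emigrate?"))
```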

The protection of privacy possible with NFTs is a differentiator compared with using generative AI in a web application. A criticism of ChatGPT, BARD, and similar systems is that input from a user becomes data consumed by the generative AI, including proprietary input, and that can ultimately become available to other users. A generative tool like Microsoft’s Copilot stores input data, while assurances are made that it isn’t used for training large language models.

Still, the concern is that the power of massive data and its control becomes centralized even more by the hyperscaler giants of computing and the few smaller companies they back.

Blockchain’s fundamental decentralization and distribution means that generative AI NFTs can offer an alternative to that system of big data control.

Additionally, AI, as mobile does now, can leverage edge computing rather than centralized cloud services.

Long-term Uses of Programmable NFTs in Life and Business

The viability of a technology depends on its ability to solve real problems and to create new possibilities and experiences beneficial to humanity and the world. Right now, we’re seeing a tremendous blooming of new uses for generative AI. When sharing information or when interaction needs to be secure, private, and impervious to short- or long-term systems failure or third-party dependency, the integration of NFTs and AI will prove especially handy.

In the personal realm, consider how you might preserve and interact with family history and important private documentation shared among family members over the long term, without pieces of that being scattered or permanently lost. Some families have members who live across the world and with whom they want to share information. Most people have family members dear to them who have passed away or will do so. Imagine an interactive record of your grandparents’ lives — one that you can have a conversation with. Imagine documenting family events from yesterday, five years ago, or fifty years ago that are less likely to be lost to time in a decentralized system. With programmable, AI-capable NFTs, you and your specified family members can own the assets with the data, the knowledge, the documents, and the interactive capabilities, as long as you have an internet connection. Digital existence and private access are distributed and not dependent on permission granted by a central authority or third party.

The same principles hold true for business uses.

Consider proprietary documentation, blueprints, and even trade secrets that need to be shared among specific engineers across a global workforce or from an older generation to a newer one within a company.

With an NFT, you could securely encrypt documentation, share it with an intended party, and, with generative AI capabilities, enable them to ask questions of it — even, in a sense, converse with engineers retired or long gone. Questions and answers about the documentation can instantly be translated into other languages.

Because NFTs are immutable and recorded permanently in a distributed public registry, impervious to deletion or third-party tampering, their potential for authenticating products and verifying lineage in the supply chain has often been noted.

Adding a layer of interactive inquiry with embedded generative AI makes this business use case even more exciting.
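A minimal sketch of that verification step, using the standard ERC-721 read functions through web3.py (v6 API), might look like the following; the RPC endpoint, contract address, and token ID are placeholders, and this is an illustration rather than a prescribed integration.

```python
# Illustrative sketch: checking current ownership and metadata location of an
# ERC-721 token on-chain before trusting its embedded documentation.
from web3 import Web3  # pip install web3

# Minimal ABI covering only the two standard ERC-721 read functions we need.
ERC721_ABI = [
    {"name": "ownerOf", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "tokenId", "type": "uint256"}],
     "outputs": [{"name": "", "type": "address"}]},
    {"name": "tokenURI", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "tokenId", "type": "uint256"}],
     "outputs": [{"name": "", "type": "string"}]},
]

def verify_token(rpc_url: str, contract_addr: str, token_id: int) -> dict:
    """Return the on-chain owner and metadata URI for a given token."""
    w3 = Web3(Web3.HTTPProvider(rpc_url))
    nft = w3.eth.contract(address=Web3.to_checksum_address(contract_addr), abi=ERC721_ABI)
    return {
        "owner": nft.functions.ownerOf(token_id).call(),      # reverts if the token doesn't exist
        "metadata_uri": nft.functions.tokenURI(token_id).call(),
    }

# Example usage (placeholders only):
# info = verify_token("https://rpc.example.org", "0x0000000000000000000000000000000000000000", 1)
```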

A programmable NFT or immutable “miniverse” with generative AI functionality is a new frontier in interactive, customized applications for personal and business use that truly puts ownership in individual hands. It’s time to lift NFTs out of the abyss of financial speculation and esoteric understanding and put the spotlight on their technical capabilities — which can make them useful for everyone.


 


Ceva’s NPU IP Thrives: Revitalized by Auto and Edge AI Collaborations https://aithority.com/technology/cevas-npu-ip-thrives-revitalized-by-auto-and-edge-ai-collaborations/ Mon, 08 Jan 2024 12:57:23 +0000 https://aithority.com/?p=556168 Ceva Expands AI Ecosystem for its Class-Leading NPU IP with New Partnerships for Automotive and Edge AI



Ceva Expands AI Ecosystem for its Class-Leading NPU IP with New Partnerships for Automotive and Edge AI

Visionary.ai neural network software ISP for enhanced camera applications and ENOT.ai neural network optimization tools and AI assistance now available for Ceva’s NeuPro-M NPU

Ceva, Inc., the leading licensor of silicon and software IP that enables Smart Edge devices to connect, sense and infer data more reliably and efficiently, announced that it has expanded the AI ecosystem for its class-leading NeuPro-M NPU IP with the addition of two new partnerships targeting automotive and vision-based Edge AI applications.

Ran Snir, Vice President and General Manager of the Vision Business Unit at Ceva, remarked: “I would like to extend a warm welcome to Visionary.ai and ENOT.ai, who join the ecosystem of partners that support our Smart Edge customer base. These partners are bringing new levels of innovation to solve complex challenges using AI, and illustrate the value proposition of our NeuPro-M NPU to enable scalable AI workloads on-device with highest power efficiency.”

Recommended AI News: Riding on the Generative AI Hype, CDP Needs a New Definition in 2024


Visionary.ai

Visionary.ai empowers cameras to capture video in extreme low-light and HDR scenarios, with results that surpass human vision. Traditionally, low-light and nighttime video has been one of the greatest imaging challenges, often resulting in dark, grainy, and unclear footage. This new technology changes the game, enabling cameras to deliver unparalleled low-light performance. The technology can be deployed on the NeuPro-M NPU to enhance multiple Edge AI vision applications across a range of markets, including mobile, PC, IoT, robotics, automotive, security, smart cities, and medical imaging.

Oren Debbi, CEO of Visionary.ai, stated: “At Visionary.ai, we have managed to surpass human vision in extreme low-light conditions by harnessing the incredible power of AI. Through partnerships with leading Edge AI NPU companies like Ceva, we can bring this technology to a global audience, improving safety and functionality of vision-based Smart Edge devices by making the unseen seen.”

Recommended AI News: GiantLeap Capital Invests in Articul8, Intel’s Enterprise Generative AI Spin-off

ENOT.ai

ENOT.ai specializes in neural network optimization, revolutionizing autonomous driving and Advanced Driver-Assistance Systems (ADAS). The collaboration combines Ceva’s NeuPro-M NPU with ENOT.ai’s revolutionary AI assistant technology, aimed at enhancing automotive safety, efficiency, and driver experience. Key features of this collaboration include an AI assistant that provides real-time lane departure warning, pedestrian detection, and voice instructions for vehicle operation, enhancing safety for new drivers and personalizing settings for an optimized driving experience.  This partnership promises to set new standards in AI-driven automotive solutions, focusing on intuitive driver assistance, vehicle efficiency, and intelligent edge computing capabilities.

David Rapoport, CRO at ENOT.ai, commented: “Ceva’s strong relationships within the automotive ecosystem and their AI NPU with industry-leading efficiency perfectly complement our AI automotive roadmap. This partnership exemplifies our mission to integrate AI seamlessly into everyday life, expanding the role of the AI assistant in the automotive cockpit.”

Ceva’s NeuPro-M NPU IP addresses the processing needs of Classic and Generative AI with industry-leading performance and power efficiency for AI inferencing workloads. The NeuPro-M architecture is highly versatile and future-proof thanks to an integrated VPU (Vector Processing Unit) that supports any future network layer, and it offers high scalability with four NPU configurations – the NPM11, NPM12, NPM14, and NPM18 – to suit any AI workload.

Demonstrations at CES 2024

At CES 2024 in Las Vegas, January 9–12, 2024, Ceva, together with Visionary.ai and ENOT.ai, will showcase the AI-ISP and automotive AI assistant solutions at the Ceva private meeting suite at the Westgate Hotel. To experience these demonstrations, contact the Ceva events team and book an appointment.



