
Ohio-based startup Mantium today announced the close of a $12.75 million seed round, along with the launch of a cloud-based AI platform that allows users to build with large language models.

The seed round, co-led by venture funds Drive Capital and Top Harvest, will be used to hire more talent, add more features to Mantium’s AI platform and drive awareness of what is achievable with large language models, especially across Africa, the firm’s CEO and co-founder Ryan Sevey told TechCrunch.

The company is looking to expand its team of 33, currently spread across nine countries, including Ghana, Nigeria and Kenya. Having a globally distributed team, Sevey said, helps generate unique insights and varied problem-solving approaches around AI.

Nvidia may be best known for graphics cards you can’t find in stores, but the company also makes some interesting software tools. An example is the noise-removal feature known as RTX Voice, which was upgraded to work with all GeForce cards earlier this year and does an excellent job of cleaning up background noise.
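Nvidia hasn’t detailed how RTX Voice works beyond it being AI-based, so the following is not Nvidia’s method: it is a minimal sketch of classic spectral gating, an older non-AI denoising technique, included only to make the idea of background-noise removal concrete. All function names and parameters here are invented for the sketch.

```python
# Illustrative only: RTX Voice uses a trained deep network that Nvidia has not
# published. This is classic spectral gating, a much simpler technique.
import numpy as np
from scipy.signal import stft, istft

def spectral_gate(audio, sr, noise_seconds=0.5, threshold=1.5):
    """Suppress frequency bins that stay close to the estimated noise floor."""
    f, t, spec = stft(audio, fs=sr, nperseg=1024)
    mag = np.abs(spec)
    # Assume the first `noise_seconds` of the clip contain background noise only.
    noise_frames = max(1, int(noise_seconds * sr / 512))  # hop = nperseg // 2
    noise_floor = mag[:, :noise_frames].mean(axis=1, keepdims=True)
    # Keep bins well above the noise floor; zero out the rest.
    mask = mag > threshold * noise_floor
    _, cleaned = istft(spec * mask, fs=sr, nperseg=1024)
    return cleaned

# Usage: denoise a 440 Hz tone preceded by half a second of pure noise.
sr = 16000
tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
noisy = np.concatenate([np.zeros(sr // 2), tone]) + 0.3 * np.random.randn(sr + sr // 2)
clean = spectral_gate(noisy, sr)
```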

Now Nvidia (Thanks, 80.lv) has been showing off a new sound-related tool in beta this year. Audio2Face is an impressive-looking auto-rigging process that runs within Nvidia’s open real-time simulation platform, Omniverse. It can take an audio file and apply surprisingly well-matched animations to the included Digital Mark 3D character model.

One of the things many people hate most about getting vaccinations and taking certain types of medication is needles. Any medication that has to be delivered intramuscularly typically requires a needle and a skilled medical professional to administer it. However, that may change in the future thanks to a new autonomous robot created by Cobionix, a company founded at the University of Waterloo.

The autonomous robot uses the company’s Cobi platform to perform injections without needles. Cobi is described as a versatile robotic platform that can be deployed rapidly and carry out tasks fully autonomously. Fitted with a needle-free injection system, the robot demonstrated the ability to deliver intramuscular injections to patients without needles and without supervision by a healthcare professional.

The robot’s developers believe that Cobi and solutions like it could help protect healthcare workers, reduce the cost of healthcare, and improve patient outcomes. Researchers believe the robot’s autonomous design will dramatically reduce the resource requirements of vaccine clinics and could help deliver vaccines and other medications to remote populations with limited access to healthcare.

A man paralyzed from the neck down due to a spinal cord injury he sustained in 2007 has shown he can communicate his thoughts, thanks to a brain implant system that translates his imagined handwriting into actual text.

The device – part of a longstanding research collaboration called BrainGate – is a brain-computer interface (BCI) that uses artificial intelligence (AI) to interpret the neural activity generated during handwriting.

In this case, the man – called T5 in the study, and 65 years of age at the time of the research – wasn’t doing any actual writing: his hand, along with all his limbs, had been paralyzed for several years.
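The published study trained a recurrent neural network on multielectrode recordings; the sketch below is a deliberately loose stand-in, using synthetic data and a plain logistic-regression classifier, just to show the shape of the decoding problem: map a vector of neural features to the character being imagined. Every name, size and number in it is assumed for illustration.

```python
# Loose illustration, not the BrainGate pipeline: synthetic "neural feature"
# vectors stand in for real recordings, and multinomial logistic regression
# stands in for the study's recurrent network.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
chars = list("abcde")                 # tiny alphabet for the demo
n_features = 192                      # electrode count is an assumption

# Fake dataset: each character gets its own mean firing pattern plus noise.
prototypes = rng.normal(size=(len(chars), n_features))
X = np.vstack([p + 0.5 * rng.normal(size=(200, n_features)) for p in prototypes])
y = np.repeat(chars, 200)

decoder = LogisticRegression(max_iter=1000).fit(X, y)

# Decode a new "thought": a noisy sample drawn around the prototype for 'c'.
sample = prototypes[2] + 0.5 * rng.normal(size=n_features)
print(decoder.predict([sample])[0])  # usually prints 'c'
```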

When a highly coherent light beam, such as that emitted by radars, is diffusely reflected off a surface with a rough structure (e.g., a piece of paper, white paint or a metallic surface), it produces a random granular effect known as the ‘speckle’ pattern. This effect results in strong fluctuations that can reduce the quality and interpretability of images collected by synthetic aperture radar (SAR) techniques.

SAR is an imaging method that can produce fine-resolution 2D or 3D images using a resolution-limited radar system. It is often employed to collect images of landscapes or object reconstructions, which can be used to create millimeter-to-centimeter scale models of the surface of Earth or other planets.

To improve the quality and reliability of SAR data, researchers worldwide have been trying to develop techniques based on deep neural networks that could reduce the speckle effect. While some of these techniques have achieved promising results, their performance is still not optimal.
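For context on the noise model these networks fight: speckle is conventionally treated as multiplicative, unit-mean noise, and classical adaptive filters such as the Lee filter serve as the baseline that deep despeckling methods aim to beat. The sketch below implements that standard model and a basic Lee filter; it is not drawn from any specific paper.

```python
# Hedged sketch: observed = clean * noise, with the noise Gamma-distributed
# for multilook SAR images, plus a basic Lee filter as a classical baseline.
import numpy as np
from scipy.ndimage import uniform_filter

def add_speckle(clean, looks=4, seed=0):
    """Multiply by unit-mean Gamma noise (shape = number of looks)."""
    rng = np.random.default_rng(seed)
    return clean * rng.gamma(looks, 1.0 / looks, size=clean.shape)

def lee_filter(img, size=7, looks=4):
    """Adaptive local-statistics filter: smooth flat areas, preserve edges."""
    mean = uniform_filter(img, size)
    var = uniform_filter(img ** 2, size) - mean ** 2
    noise_var = (mean ** 2) / looks            # multiplicative-noise variance
    weight = np.clip((var - noise_var) / np.maximum(var, 1e-12), 0, 1)
    return mean + weight * (img - mean)

# Usage on a toy "scene": a bright square on a dark background.
clean = np.ones((64, 64)); clean[16:48, 16:48] = 4.0
noisy = add_speckle(clean)
despeckled = lee_filter(noisy)
```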

Just over a year after launching its flagship product, Landing AI secured a $57 million round of Series A funding to continue building tools that enable manufacturers to more easily and quickly build and deploy artificial intelligence systems.

The company, started by former Google and Baidu AI guru Andrew Ng, developed LandingLens, a visual inspection tool that applies AI and deep learning to find product defects faster and more accurately.

Ng says industries should adopt a data-centric approach to building AI, which gives manufacturers a more efficient way to teach an AI model what to do. The platform’s no-code/low-code capabilities let users build advanced AI models in less than a day with just a few mouse clicks.

This video gives an overview of human neuroscience and applies it to the design of an artificial general intelligence named Eta.

Go to www.startengine.com/orbai to own shares in the future of AI.
Check out https://www.orbai.ai/about-us.htm for details on the company, tech, patents, products and more.

What we usually think of as Artificial Intelligence today, when we see human-like robots and holograms in our fiction, talking and acting like real people and having human-level or even superhuman intelligence and capabilities, is actually called Artificial General Intelligence (AGI), and it does NOT exist anywhere on Earth yet. What we do have is called Deep Learning, which has fundamental limitations that will not allow it to become AGI.

For an AI to pass the threshold of human intelligence and become an artificial general intelligence, it must be able to see, hear, and experience its environment. It needs to learn that environment, organize its memory non-locally, and store abstract concepts in a distributed architecture so it can model its environment and the events and people in it.

It needs to be able to speak conversationally and interact verbally like a human, and to understand the experiences, events, and concepts behind the words and sentences of language so it can compose language at a human level.

It needs to be able to solve all the problems that a human can, using flexible memory recall, analogy, metaphor, imagination, intuition, logic and deduction from sparse information.

As promised, Walmart has started making fully driverless box-truck deliveries between its own locations on a fixed 7-mile loop in partnership with startup Gatik, the companies announced. Despite those limitations, the route in Bentonville, Arkansas involves “intersections, traffic lights and merging on dense urban roads,” the companies said. It’s another shot of good news for the progress of self-driving vehicles after GM’s Cruise launched its self-driving taxis into testing last week.

The Gatik trucks are bringing grocery orders from a Walmart fulfillment center (dark store) to a nearby Walmart Neighborhood Market grocery store in Bentonville, home of the company’s headquarters. The route covers the “middle mile” transportation of goods between warehouses and stores. The program effectively launched following December 2020 approval by the Arkansas State Highway Commission and has been running driverless since this summer.

Gatik, a Silicon Valley-based developer of robotic technology to handle “middle-mile” deliveries from distribution centers to stores, has begun hauling goods for Walmart in autonomous trucks without a human backup at the wheel for the first time.

Gatik, which has been making delivery runs for Walmart since 2019 in the retail giant’s hometown of Bentonville, Arkansas, is operating two fully autonomous trucks hauling goods on a fixed, 7.1-mile route between an e-commerce distribution facility and a Walmart Neighborhood Market store. This new phase started in August and marks the first time any autonomous trucking company has operated commercial delivery routes without a human backup, the companies said.

A milestone achievement for the military.

After multiple attempts, the Defense Advanced Research Projects Agency (DARPA) has confirmed that it successfully completed a mid-air recovery of the X-61 Gremlins drone. While details of the test were not revealed, DARPA said the mission was accomplished last month at the Dugway Proving Ground in Utah.

The Gremlins drone is a semi-autonomous unmanned aerial vehicle (UAV) designed to carry a wide variety of payloads, including electronic warfare systems, while being operated remotely. Launched from a mothership, such as a modified C-130 Hercules cargo aircraft, these drones are built to operate in swarms, offering the military a low-cost way of engaging its adversaries without getting close to enemy lines. Mid-air recovery of these drones is therefore vital for them to enter service.