Marc / Suman,
Thought this might interest you
Regards,
Hemen
Meta researchers build an AI that learns equally well from visual, written or spoken materials
Extract:
For instance, by reading a thousand books it will learn the relative positions of words and ideas about grammatical structure without anyone telling it what objects or articles or commas are — it gets there by drawing inferences from many examples.
This feels intuitively more like how people learn, which is part of why researchers favor the approach.
The idea for data2vec was to build an AI framework that would learn in a more abstract way, meaning that, starting from scratch, you could give it books to read or images to scan or speech to sound out, and after a bit of training it would learn any of those things. It's a bit like starting with a single seed, but depending on what plant food you give it, it grows into a daffodil, pansy or tulip.
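Under the hood, data2vec uses a student-teacher setup: a "student" network sees a partially masked input and is trained to predict the latent representations that a "teacher" network (an exponential moving average of the student's own weights) produces from the full input. Because the targets are learned representations rather than pixels or words, the same recipe applies to images, text or speech. The snippet below is a toy numpy sketch of that idea under my own assumptions, not Meta's implementation; the encoder, sizes and names are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, w):
    # Toy stand-in for a real encoder: one linear layer plus tanh.
    return np.tanh(x @ w)

dim_in, dim_hid = 8, 4
student_w = rng.normal(scale=0.1, size=(dim_in, dim_hid))
teacher_w = student_w.copy()  # teacher starts as a copy of the student

def ema_update(teacher, student, tau=0.999):
    # Teacher weights slowly track the student (exponential moving average).
    return tau * teacher + (1 - tau) * student

# One batch of inputs; any modality, once turned into vectors.
x = rng.normal(size=(16, dim_in))
mask = rng.random(x.shape) < 0.3       # hide ~30% of each input
x_masked = np.where(mask, 0.0, x)

target = encode(x, teacher_w)          # teacher sees the full input
pred = encode(x_masked, student_w)     # student sees the masked input
loss = np.mean((pred - target) ** 2)   # regress latent targets, not raw data

teacher_w = ema_update(teacher_w, student_w)
```

In a real training loop the loss would drive gradient updates to `student_w`, and the EMA step would run after every batch; the point of the sketch is only that the learning target is the teacher's internal representation, which is what makes the framework modality-agnostic.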
“People experience the world through a combination of sight, sound and words, and systems like this could one day understand the world the way we do,” commented CEO Mark Zuckerberg on the research.
Following are my own (no doubt crude) thoughts on this:
SELF-LEARNING SOFTWARE … Sept 2003