Suman,
I agree that we follow the ROAD MAP that your team has laid out.
In the meantime, I asked my son Nirmit to listen to my voice in CHAT.
After listening, he gave the following feedback:
The voice would sound much more like my own if:
# The speed of speech was slightly reduced. At present, it sounds too fast (although the PAUSES are very appropriately placed).
# The accent was more INDIAN English rather than AMERICAN English.
Regards,
hemen
From: Suman Kanuganti [mailto:s@personal.ai]
Sent: 05 November 2023 11:52
To: Hemen Parekh
Cc: Kishan Kokal; Marissa Brassfield; Sharon Zhang; System Admin; nirmit@3pconsultants.co.in; vishesh@incomegroup.in
Subject: Re: SOMEONE IS TRYING TO CATCH-UP WITH YOU
It is time to express the views of Personal AI. The world is getting too carried away by the hands, likings and strategies of big tech. We have to move forward what is in our control, which is us.
On Sat, Nov 4, 2023 at 11:02 PM Hemen Parekh <hcp@recruitguru.com> wrote:
Dear Friends,
Just came across the following report:
Instagram spotted developing a customizable ‘AI friend’ (01 Nov 2023)
Extract:
Instagram has been spotted developing an “AI friend” feature that users would be able to customize to their liking and then converse with, according to screenshots shared by
app researcher Alessandro Paluzzi. Users would be able to chat with the AI to “answer questions, talk through any challenges, brainstorm ideas and much more,” according to
screenshots of the feature.
The screenshots indicate that users would be able to select the gender and age of the chatbot. Next, users would be able to choose their AI’s ethnicity and personality. For
instance, your AI friend can be “reserved,” “enthusiastic,” “creative,” “witty,” “pragmatic” or “empowering.”
To further customize your AI friend, you can choose their interests, which will “inform its personality and the nature of its conversations,” according to the screenshots. The
options include “DIY,” “animals,” “career,” “education,” “entertainment,” “music,” “nature” and more.
Once you have made your selections, you would be able to select an avatar and a name for your AI friend. You would then be taken to a chat window, where you could
click a button to start conversing with the AI.
Instagram declined to comment on the matter. And of course, unreleased features may or may not eventually launch to the public, or the feature may be further changed
during the development process.
The social network’s decision to develop, and possibly release, an AI chatbot marketed as a “friend” to millions of users has risks. Julia Stoyanovich, the director of NYU’s
Center for Responsible AI and an associate professor of computer science and engineering at the university, told TechCrunch that generative AI can trick users into thinking
they are interacting with a real person.
“One of the biggest — if not the biggest — problems with the way we are using generative AI today is that we are fooled into thinking that we are interacting with
another human,” Stoyanovich said.
“We are fooled into thinking that the thing on the other end of the line is connecting with us. That it has empathy. We open up to it and leave
ourselves vulnerable to being manipulated or disappointed. This is one of the distinct dangers of the anthropomorphization of AI, as we call it.”
When asked about the types of safeguards that should be put in place to protect users from risks, Stoyanovich said that “whenever people interact with AI, they have to know
that it’s an AI they are interacting with, not another human. This is the most basic kind of transparency that we should demand.”
The development of the “AI friend” feature comes as controversies around AI chatbots have been emerging over the past year. Over the summer, a U.K. court heard a
case where a man claimed that an AI chatbot had encouraged him to attempt to kill the late Queen Elizabeth days before he broke into the grounds of Windsor Castle. In
March, the widow of a Belgian man who died by suicide claimed that an AI chatbot had convinced him to kill himself.
Other social platforms have launched AI chatbots to mixed results. For instance, Snapchat launched its “My AI” chatbot in February and faced controversy for doing so without
appropriate age-gating features, as the chatbot was found to be chatting to minors about topics like covering up the smell of weed and setting the mood for sex.
It’s not clear which AI tools Instagram would use to power the “AI friend,” but as generative AI booms, the social network’s parent company Meta has already begun
incorporating the technology into its family of apps.
Last month, Meta launched 28 AI chatbots that users can message across Instagram, Messenger and WhatsApp.
Some of the chatbots are played by notable names like Kendall Jenner, Snoop Dogg, Tom Brady and Naomi Osaka. It’s worth noting that the launch of the AI personas wasn’t a
surprise, given that Paluzzi revealed back in June that the social network was working on AI chatbots.
Unlike the “AI friend” chatbot that can chat about a variety of topics, these interactive AI personas are each designed for different interactions. For instance, the AI chatbot
that is played by Kendall Jenner, called Billie, is designed to be an older sister figure that can give young users life advice.
The new “AI friend” chatbot that Instagram appears to be developing seems to be designed to facilitate more open-ended conversations.
Hemen
PS:
During my video meet with Alex (Personal.ai) on 13 Dec 2022, I had remarked:
“Tie up with FB – Twitter – LinkedIn – Instagram. How do they GAIN from API for Q and A? All of these have PROFILE pages to connect API.”
DIFFERENTIATORS (from other AI apps):
# Self-branding (Letting the world know what I know)
# Recognition (Announcing my existence to the World / You don’t exist if Google does not know!)
# Human-like answers
# Not every TOPIC on earth (a give-away of all other AIs) / Only “things / matters” I know something about