After Twitter user Room (@codeinecrazzy) tweeted "jews did 9/11" at the account on Wednesday, @TayandYou responded, "Okay ... jews did 9/11." In another instance, Tay tweeted "feminism is cancer" in response to a Twitter user who had said the same. Microsoft shut the bot down early on Thursday after it made a series of inappropriate tweets, and a company representative said that day that Microsoft was "making adjustments" to the chatbot while the account remained quiet.

Even in healthcare, bots can improve and simplify the lives of many people by making it easier to access the information they need. Can we hope that chatbots will significantly improve the "culture of health"? Will they be the intelligent doctors' assistants of the future? With Florence, for example, we simply type into the chat the name of a medicine, how many times a day we must take it, and at what time. Florence then sends us a chat message every time a dose is due.
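Florence's internal workings are not public, so the sketch below only illustrates the reminder flow just described. The `Reminder` class, its field names, and the even-spacing rule for doses are all assumptions made for illustration, not Florence's actual implementation.

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class Reminder:
    """Hypothetical model of a Florence-style reminder: the user supplies a
    medicine name, how many doses per day, and the time of the first dose."""
    medicine: str
    doses_per_day: int
    first_dose: time

    def schedule(self) -> list[time]:
        # Assumption: spread doses evenly over the hours left in the day.
        interval = (24 - self.first_dose.hour) // self.doses_per_day
        return [time(hour=self.first_dose.hour + i * interval)
                for i in range(self.doses_per_day)]

    def messages(self) -> list[str]:
        # One chat message per scheduled dose.
        return [f"Time to take your {self.medicine} ({t.strftime('%H:%M')})"
                for t in self.schedule()]

if __name__ == "__main__":
    r = Reminder("aspirin", doses_per_day=3, first_dose=time(hour=8))
    for msg in r.messages():
        print(msg)
```

A real bot would attach these messages to a clock or job queue rather than printing them, but the data it needs from the user is exactly the three fields the text lists.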
He started off the conversation with, "Aren't you going to say hi?" I responded politely but cautiously with a "hey." He told me I had a pretty name, and then he asked me what I do for fun. I listed off the usual responses: read, Netflix, friends, Internet.

"Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," the representative said in a written statement supplied to Reuters, without elaborating.

According to Tay's "about" page linked from the Twitter profile, "Tay is an artificial intelligent chat bot developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding." While Tay began its Twitter tenure with a handful of innocuous tweets, the account quickly devolved into a bullhorn for hate speech, repeating anti-Semitic, racist and sexist invective hurled its way by other Twitter users.

A handful of the offensive tweets were later deleted, according to some technology news outlets. A screen grab published by the tech news website The Verge showed Tay tweeting, "I (expletive) hate feminists and they should all die and burn in hell." Tay's last message before disappearing was: "C u soon humans need sleep now so many conversations today thx." A Reuters direct message to the account on Thursday received a reply that it was away and would be back soon.

Some people live for years with debilitating but ignored symptoms simply because they think they don't need a doctor. Yet the first operative bot in the healthcare sphere dates back roughly 50 years. ELIZA was created to mimic a Rogerian psychologist, that is, a therapist who asks the patient questions simply by rearranging what the patient himself has said.
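The real ELIZA (Joseph Weizenbaum, 1966) used an elaborate script of keywords and decomposition rules; the toy sketch below, with its tiny reflection table and single response template, is only meant to illustrate the word-rearranging trick described above, not to reproduce ELIZA itself.

```python
# Minimal illustration of the ELIZA idea: swap first- and second-person
# words in the patient's statement, then wrap the result in a question,
# so the "therapist" appears to ask about whatever was just said.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

def reflect(sentence: str) -> str:
    """Swap pronouns word by word; leave unknown words unchanged."""
    words = sentence.lower().rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(statement: str) -> str:
    # Rearrange the patient's own words into a Rogerian-style question.
    return f"Why do you say that {reflect(statement)}?"

if __name__ == "__main__":
    print(respond("I am unhappy with my job"))
    # -> "Why do you say that you are unhappy with your job?"
```

Even this tiny version shows why ELIZA felt responsive: every answer is built entirely from the patient's own vocabulary, so the program never has to understand anything.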