I was born and raised in Chengdu, China, the hometown of pandas. After graduating with honors from Chengdu Foreign Languages School, I was recommended to Beijing University of Aeronautics and Astronautics without having to take entrance examinations. After two years studying Linguistics in Beijing, I made my way across the Atlantic and majored in Psychology at the University of Minnesota. I am now living in Minneapolis, MN, resolved to bring out the human side of technology.

I graduated with a Master's in Computer Science from the University of Minnesota, Twin Cities. I've worked at the GroupLens Research Lab, focusing on various topics in Human-Computer Interaction. My research interests are to understand user behavioral patterns, improve the user experience of current social computing systems, and design new computational tools that facilitate collaboration among peer users. I do research in online peer production systems such as Wikipedia, sharing economy systems such as Couchsurfing and Airbnb, and crowdsourcing systems such as Amazon Mechanical Turk. My current research interests include trust and bias in online sharing systems, misinformation in media, and building VR/AR systems for social good. In addition to conducting empirical research, I have developed prototypes of various web applications and mobile systems. Some of these involve extensive UI research and iterative prototyping; others are full-stack working systems.

I joined an international startup, Amy Software Inc., as the Senior Product Manager on the founding team. Our vision is to allow anyone who wants a chatbot to be able to DIY one on their own. Try Amy yourself! I enjoy traveling (20+ countries so far), photography, and playing badminton when I'm not writing papers.
On Friday, Google fired Blake Lemoine, a software engineer who went public with his concerns that a conversational technology the company was developing had achieved sentience. Lemoine went outside the company to consult with experts on the tech's potential sentience, then publicly shared his concerns in a Medium post and a subsequent interview with The Washington Post. Google had suspended Lemoine in June for violating a confidentiality policy, and he has now been fired. Lemoine himself is slated to explain what happened on an upcoming episode of the podcast for Big Technology, a Substack that first reported the story.

Google continues to deny that its LaMDA technology, or Language Model for Dialogue Applications, has achieved sentience. The company says LaMDA has been through 11 separate reviews, and it published a research paper on the technology back in January. But Lemoine's fireable offense was sharing internal information, Google said in a statement. "It's regrettable that despite lengthy engagement on this topic, Blake still chose to persistently violate clear employment and data security policies that include the need to safeguard product information," Google said in the statement. "We will continue our careful development of language models, and we wish Blake well."

LaMDA is described as a sophisticated chatbot: send it messages, and it will auto-generate a response that fits the context, Google spokesperson Brian Gabriel said in an earlier statement.