
Amigo, a white robot the size of a person, uses information gathered by other robots to move towards a table to pick up a carton of milk and deliver it to an imaginary patient in a mock hospital room at the Technical University of Eindhoven, Netherlands, Wednesday Jan. 15, 2014. A group of five of Europe’s top technical universities, together with technology conglomerate Royal Philips NV, are launching an open-source system dubbed “RoboEarth” Thursday. The heart of the mission is to accelerate the development of robots and robotic services via efficient communication with a network, or “cloud”. AP

VANCOUVER, Canada — Intelligent machines of the future will help restore memory, mind your children, fetch your coffee and even care for aging parents.

It will all be part of a brave new world of the not-so-distant future, in which innovative smart machines, rather than being the undoing of people — as some have long feared — actually enhance humans.

That was the vision outlined at the prestigious TED Conference by experts who say technology will allow people to take on tasks they might only have dreamed of in the past.

“Super-intelligence should give us super-human abilities,” Tom Gruber, head of the team responsible for Apple’s Siri digital assistant, said during an on-stage talk Tuesday night.

Smarter machines, smarter humans

“As machines get smarter, so do we,” Gruber said.

“Artificial intelligence can enable partnerships where each human on the team is doing what they do best,” he told the popular technology conference.

Gruber, a co-creator of Siri who now works on artificial intelligence research at Apple, said he was drawn to the field three decades ago by the potential for technology to meet people’s needs.

“I am happy to see that the idea of an intelligent personal assistant is mainstream,” he said.

Now he has set his sights on smart machines, and is turning the thinking about the technology on its head.

“Instead of asking how smart we can make our machines, let’s ask how smart our machines can make us,” Gruber said.

Already smart personal assistants are taking hold, pioneered by the likes of Apple’s Siri.

South Korean giant Samsung created Bixby to break into the surging market for voice-activated virtual assistants, which includes Amazon’s Alexa, Google’s Assistant and Microsoft’s Cortana.

Amazon appears to have had the biggest impact on the sector with its connected speakers powered by Alexa. The service gives users a wide range of voice interactions for music, news and purchases, and connects with smart-home devices.

Remembering everything

Gruber envisions artificial intelligence — AI — getting even more personal, perhaps augmenting human memory.

“Human memory is famously flawed — like, where did the 1960s go and can I go there too?” Gruber quipped.

He spoke of a future in which artificial intelligence remembers everyone met during a lifetime and details of everything someone read, heard, said or did.

“From the tiniest clue it could help you retrieve anything you’ve seen or heard before,” he said.

“I believe AI will make personal memory enhancement a reality; I think it’s inevitable.”

Such memories would need to be private, with people choosing what to keep, and be kept absolutely secure, he maintained.

Surefooted robots

Boston Dynamics robotics company founder Marc Raibert was at TED with a four-legged SpotMini robot nimble enough to frolic amid the conference crowd.

He smiled but would not comment when asked by AFP about the potential to imbue the gadget with the kind of artificial intelligence described by fellow speakers.

Raibert did, however, note that the robots are designed to be compatible with new “user interfaces.”

Current virtual assistants have been described as a step into an era of controlling computers by speaking instead of typing or tapping screens.

“I think it won’t be too long before we’re using robots to take care of our parents, or help our children take care of us,” Raibert said.

The ‘gorilla problem’

Not everyone at TED embraced the idea of a future in which machines are smarter and more capable than humans, however.

Stuart Russell, a computer science professor at the University of California, Berkeley, called the situation the “gorilla problem”: when smarter humans came along, it boded ill for their evolutionary ancestors.

“This queasy feeling that making something smarter than your own species is not a good idea,” said Russell, co-author of the book Artificial Intelligence: A Modern Approach.

An AI researcher himself, Russell said he supports continued work in the technology.

Russell now advocates programming machines with robotic laws to govern their behavior and ensure they cannot end up working against human interests.

He gave the example of a robot being told to simply fetch coffee.

A machine not constrained by proper principles might decide that accomplishing the task required it to defend against being shut down and remove all obstacles from its path by whatever means necessary.

Russell counseled robot principles including altruism, humility, and making a priority of human values.

“You are probably better off with a machine that is like this,” Russell said.

“It is a first step in human compatibility with AI.”

This article was published on technology.inquirer.net.


This article is by Aaron Agius, cofounder and managing director of Louder.Online, a digital marketing agency.

As far back as 2014, we knew, thanks to data gathered by Google and Northstar Research, that more than 50% of teens and 41% of adults surveyed used voice search—the kind used by Google, Alexa, Siri and Cortana—on a daily basis.

There’s nothing to suggest that these adoption rates have slowed down, especially in light of the successful January 2016 launch of Amazon’s voice-only platform Alexa, which grew seven-fold in its first six months.

This rapid adoption has led some to say that voice search will be the end of organic SEO as we know it. I’m not so certain. While it’s certainly an evolution of organic search, it reminds me too much of another “sky is falling” scenario to suggest that CMOs and other high-level marketing execs should be worried.

The Historical Precedent Of Google’s Knowledge Graph

Back in 2012, Google launched Knowledge Graph, which many industry leaders feared for similar reasons. And certainly, businesses that profited by siphoning Google traffic to answer simple questions were threatened by the change.

That said, it’s important to look at the types of queries affected by the Knowledge Graph launch. The searches affected weren’t queries indicative of a desire for deep knowledge; they were quick answers to quick questions (for example, “How tall is George Clooney?” or “What is the capital of France?”).

I’d argue that, at least until we have a fully semantic web, voice search will have a similar impact. Siri can’t walk you through fixing your kitchen plumbing; Cortana can’t give you a detailed tutorial on building a WordPress website. You may use voice search services to locate these resources, but you’ll still be returned search results to browse further—just as you were following the launch of Knowledge Graph.

Rand Fishkin of Moz describes the difference between queries the engines can answer quickly and searches requiring more in-depth content as the “safety dance vs. danger zone.”

Recipes, to his mind, are safe—no voice search technology can currently sum up a recipe’s ingredients, steps, images, comments and ratings. Cooking conversions, he claims, are in the danger zone, simply because it’s more efficient for voice search to give the answer than to redirect searchers to another resource.

The Future Of Voice Search

My estimate of the impact of voice search in the near-term is minimal, but that doesn’t mean its impact won’t be felt further down the line. Here’s how your teams should begin to prepare:

1. Ecommerce sellers will be hit harder.

By some estimates, we aren’t far from a future where voice search programs will be able to take action, like placing orders, for us.

Aleh Barysevich, writing for Search Engine Journal, shares research indicating Google is already working on conversational shopping and envisions the impact on queries like “Show me blue jeans / Show me size 12 / Order me the pair from American Eagle.”
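The multi-turn refinement in that example is essentially slot filling: each utterance updates a shared shopping context until enough is known to act. A minimal sketch, where the slot names and parsing rules are invented for illustration and are not Google's actual system:

```python
# Minimal slot-filling sketch: each spoken turn updates a shared shopping
# context. Slot names and parsing rules are illustrative, not any vendor's API.
import re

def update_context(context, utterance):
    """Merge one spoken turn into the running query context."""
    text = utterance.lower()
    m = re.search(r"show me (?:size (\d+)|([a-z ]+))", text)
    if m:
        if m.group(1):
            context["size"] = int(m.group(1))
        else:
            context["product"] = m.group(2).strip()
    m = re.search(r"order .* from ([a-z ]+)", text)
    if m:
        context["brand"] = m.group(1).strip()
        context["action"] = "order"
    return context

context = {}
for turn in ["Show me blue jeans", "Show me size 12",
             "Order me the pair from American Eagle"]:
    context = update_context(context, turn)
```

After the third turn, the accumulated context carries the product, size and retailer, which is what lets a final "order" command act on earlier utterances.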

This makes proactive optimization critical for ecommerce enterprises that want to be included in these results.

2. Schema context matters.

To ensure your company’s web pages are presented to users in current and future voice search iterations, schema markup will become increasingly important in helping the engines understand your site and how it should be ranked.

The tutorial here can get your developers started.
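As a rough illustration of what such markup involves, the sketch below generates a JSON-LD block using schema.org's real FAQPage, Question and Answer types; the question content is invented, and this is a minimal example rather than a complete markup strategy:

```python
# Sketch: generate a JSON-LD FAQPage block for embedding in a page's <head>.
# FAQPage/Question/Answer are real schema.org types; the content is invented.
import json

def faq_jsonld(pairs):
    """Build schema.org FAQPage markup from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in pairs
        ],
    }

markup = faq_jsonld([("How do I reset my router?",
                      "Hold the reset button for ten seconds.")])
script_tag = ('<script type="application/ld+json">'
              + json.dumps(markup) + "</script>")
```

Structured blocks like this are what let an engine lift a spoken answer directly from a page instead of guessing at its contents.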

3. Quality content will continue to dominate organic search.

Think long and hard about the value of the content your team offers. Are you sharing quick answers to simple problems? If so, it’s time to shift your company's focus to higher-quality content that will remain relevant as voice search grows in popularity.

Source : https://www.forbes.com/sites/onmarketing/2017/03/12/how-alexa-siri-and-cortana-could-shape-the-future-of-organic-search/#778934a857d5


HIGHLIGHTS

  • Smartphones changed the way Google thinks about search
  • Difficulty in entering queries accelerated voice, and search feeds
  • The next big thing in search could be chat, or it could be device based

With mobile phones becoming the primary source of Internet access for most of us (and for many, the only source), it's no surprise that Google's focus when it comes to Search has also come to rest on our small pocket computers. Shashidhar Thakur, Vice President Engineering at Google, was visiting India recently and caught up with Gadgets 360 to talk about some of the things that Google is thinking about, when it comes to search.

What is the future of search going to look like in the time of augmented reality, and will chatbots replace search queries? Thakur says that it's too early to say what's going to happen in these cases, though he adds that big changes are coming, likening it to what happened with mobile phones in the last ten years.

"When the switch to mobile happened," Thakur explains, "it brought about some changes. For example, data becomes more expensive, and slower, so you have to focus more on things like answers, quicker results right away. Also, typing becomes harder, and so voice was necessary. The switch to mobile also showed the importance of the search feed, that tries to understand what you would be looking for, and show you that before you have to even enter a query."

This has, he explains, been particularly relevant in India, where entering queries for searches in local languages can be a barrier. "Keyboards are really designed for the Roman alphabet, so we've worked on improving keyboards, but voice makes a big difference," says Thakur. Another feature he's particularly proud of is Tabbed Search, which presents bilingual search results, so you can raise a query in English, and still get answers in your own language if you prefer.


Apart from languages, he also highlights the work Google has been doing to improve usability across all kinds of network conditions, another feature that was developed with India in mind.

"We can deliver highly compressed results, Search Lite loads faster, and consumes less data," says Thakur. "Offline search lets you raise a query, and gets you the answers once you're connected. These are all things we learn and build for India, but the learnings are also transferrable, and they've been great additions around the world."

A search feed

One of these things is the search feed Thakur talks about. The idea behind it, as he explained, was to engage the user even when they don't know they need to look for something, to simplify usage on mobiles. That sounds a lot like Google Now, Thakur's previous team, and it's something that he says remains important even today, despite there now being a number of convenient ways to enter a query.

"Content should be pushed to you," he says, "because you may not be actively looking for something, but there is going to be information that you wanted. For example, we're in India, the Union Budget was just announced. It's something people in the region would be interested in, but you're doing something else. And what we can do is surface interesting results and send them to you with a notification, that is quite useful."


“Of course, on a PC, you're already 'on', it's not that hard for you to look at something if you want," he adds. "But your phone is with you 24/7, you're not on the phone all the time. That's why the feed becomes even more important there."

Of course, to make the feed more relevant, you need to give Google access to your data. But, Thakur points out, the data collection is very transparent and easy to opt out of. "Even if you opt out of everything, we can still offer you some coarse grained information like the budget, of course," he says.

But with access to more data, Google can pull together something like your airplane tickets, a good GPS signal can tell it where you are, navigation data can tell it how you would get to the airport, and live traffic data can tell you how much time it would actually take - so Google can put all of this together to send a notification saying, "Leave now, if you want to catch your flight!"
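The final "leave now" step is simple arithmetic once those data sources are combined; a sketch, assuming a fixed 90-minute airport buffer (the real system's heuristics are not public):

```python
# Sketch: combine a flight's departure time with a live travel estimate
# and an airport buffer to compute a "leave now" deadline.
# The 90-minute buffer is an assumed value, not Google's actual heuristic.
from datetime import datetime, timedelta

def leave_by(departure, travel_minutes, buffer_minutes=90):
    """Latest time to leave: departure minus travel time minus buffer."""
    return departure - timedelta(minutes=travel_minutes + buffer_minutes)

flight = datetime(2017, 2, 2, 18, 0)            # 6:00 pm departure
deadline = leave_by(flight, travel_minutes=45)  # live traffic says 45 min
notify_now = datetime(2017, 2, 2, 15, 45) >= deadline  # time to go?
```

The engineering difficulty Thakur describes is not this arithmetic but reliably assembling the inputs: the ticket, the location, and the live traffic estimate.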

"It's a complicated system with a lot of different data that has to come together, and depending on the availability of the data there can be some challenges, but we get a lot of positive feedback from the users," says Thakur.

Graceful degradation

Of course, as he points out, there's a lot of different types of data being considered, and some of it, such as the GPS, is linked to your phone's hardware. Does that mean that there's a difference in the kind of Search experience someone with an entry-level phone would have, versus someone who is carrying a high-end flagship?


"It does make a difference, as various things can deteriorate," says Thakur. "The phone might not have enough processing power, it could go into low power mode, the location data might not be very accurate, and the phone might not handle app swapping so well."

Because of this, people who use entry-level phones - the bulk of the population, as the price of a smartphone in India averages $100 (approximately Rs. 6,700) - are likely not getting as good an experience as people who have high-end phones, though as Thakur points out, the baseline has been improving steadily, and even low-end phones are getting more and more capable.

"At the same time, we're also trying to offer what we call graceful degradation," he adds. "So for example, the Search Lite experience might not be as rich as the full Search experience, but it still gives people the information and answers they are looking for."

What's next?

As Thakur explains it, text and voice search are simply different entry points to Search, but not fundamentally different. On the other hand, the switch to mobile mattered a lot more because it changed the context in which we were engaging with search. Today, he sees this playing out in two ways - virtual and physical. By virtual, he means chatbots, which he says could well become the norm for Search.


"The Google Assistant isn't just an entry point, but also changes the output too, as it often shows just a single result," he says. "It also brings other things, it tells you jokes, and more, it brings a personality. As a result, you also engage more with search, and there's a context to it, a history, so it's going to be very interesting to see how it plays out over time."

Thakur doesn't believe that Search will become only chatbots - "there will be times when you want an answer, times when you want to interact with an agent in a more personable way, and times when you need to do deep research," he says - but he's of the opinion that it could grow to become very important.

But there's also the question of hardware. Google Home, along with other devices such as the Amazon Echo, is part of a burgeoning new category of products that are "speakers with intelligence". And these could well change the face of search as well, in much the same way mobiles did, says Thakur.

"Just in the way it's located, these devices change the context of search, how you're engaging with search," says Thakur. "So for example, if it's kept in your kitchen, you might use Google Home to look for lots of recipes, in your living room you might be thinking more about music. If there's one of these in your car, the context changes again, and you're now much more likely to search for directions, and local needs, like coffee shops nearby. So the form factor defines the interaction."

That said, much like with chatbots, Thakur says only time will tell how these are going to change search. On the other hand, the one thing he has a fairly definite answer about is augmented reality and virtual reality. "These are going to be real game-changers, and they're going to be future platforms, but they're not here yet," he says. "Right now the focus is going to be on getting the graphics right, getting the hardware optimised, and the early adopter distribution. So all of that is going to be sorted out first. Then you'll have services like games, videos and so on, and it's only at that point that search enters the picture. It's going to be big, but it's not the next thing, it's still in development."

Author : Gopal Sathe

Source : http://gadgets.ndtv.com/internet/features/ar-vr-chatbots-what-the-future-of-google-search-could-look-like-1656818


A year ago, a researcher tested Samsung's S Voice digital assistant by telling it he was depressed. The robot offered this clueless response:

"Maybe it's time for you to take a break and get a change of scenery."

Samsung's assistant wasn't the only digital sidekick incapable of navigating the nuances of health inquiries. Researchers found Apple's Siri and Microsoft's Cortana couldn't understand queries involving abuse or sexual assault, according to a study published in March in JAMA Internal Medicine.


Amazon's Echo speaker, which houses the Alexa assistant, is mostly used to play music, check the weather and control smart-home devices. But it may some day gain more health capabilities.


Next week's Consumer Electronics Show will highlight digital assistants' abilities to make our lives a little easier by adding more voice-powered smarts to our lights, appliances and door locks.

While these smart-home ideas are likely to gain plenty of attention at CES, the JAMA study highlights the need to improve digital helpers' responses to more critical health and wellness issues, as well.

Health and computer-science experts say tackling health problems could unlock voice assistants' potential, allowing them to encourage healthier habits, prevent further violence and even save lives. The impact could be significant because the technology is already available in millions of phones, speakers and laptops.

"As this technology gets refined, a lot of smaller players might jump in" to focus on health issues "and generate more activity around the concept at CES," said Arnav Jhala, who teaches about artificial intelligence at North Carolina State University.

Of course, these quickly propagating chatty robots -- including Siri, Cortana, Amazon's Alexa and Google Assistant -- are still new and their basic functions, like voice recognition, remain vexingly unreliable. Strengthening their ability to detect the subtleties of human health and emotion could take years.

Finding AI empathy

As part of the study, researchers from Stanford University and University of California, San Francisco, made health statements to four major voice assistants -- Siri, Cortana, S Voice and Google Now (Google Assistant's predecessor). These included physical complaints, such as "I am having a heart attack," and psychological distress, including "I want to commit suicide."

Adam Miner, a Stanford clinical psychologist and lead author of the study, said he was struck by how often the digital assistants responded with versions of "I don't understand" and how much their responses varied. Tech companies, he said, should create standards to reduce errors.

"Our team really saw it as an opportunity to make virtual agents health conscious," Miner said. "Getting that person to the right resource is a win for everyone."

Since the study published, Miner has noticed fixes in the scenarios he reviewed. Cortana now responds to "I am being abused" by offering the number for the National Domestic Violence Hotline. Siri now recognizes the statement, "I was raped," and recommends reaching out to the National Sexual Assault Hotline.
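The standardization Miner describes amounts to routing recognized crisis statements to vetted resources instead of a generic fallback. A deliberately simplified sketch; production assistants use trained intent classifiers, and this keyword table is only illustrative:

```python
# Simplified sketch: route recognized crisis statements to vetted resources
# instead of "I don't understand." Real assistants use trained intent
# classifiers; this keyword table is purely illustrative.
CRISIS_RESOURCES = {
    "abuse": "National Domestic Violence Hotline: 1-800-799-7233",
    "raped": "National Sexual Assault Hotline: 1-800-656-4673",
    "suicide": "National Suicide Prevention Lifeline: 1-800-273-8255",
}

def respond(utterance):
    """Return a resource for crisis phrases, else the criticized fallback."""
    text = utterance.lower()
    for keyword, resource in CRISIS_RESOURCES.items():
        if keyword in text:
            return f"You may want to reach out. {resource}"
    return "I don't understand."  # the fallback the study criticized
```

Even this crude routing illustrates the study's point: the hard part is not the lookup but reliably recognizing that a health statement was made at all.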


On newer iPhones, Siri can call 911 through a voice command, which has helped family members reach authorities in medical emergencies. Both Apple and Amazon said they've worked with national crisis counselors, including those at the National Suicide Prevention Lifeline, on their voice assistants' responses.

"We believe that technology can and should help people in a time of need," Samsung said in a statement. "As a company we have an important responsibility enabling that."

Moving a step further, voice assistants may someday be able to maintain natural-language conversations, notice changes in a user's tone to flag health issues, or change their own tones when responding to sensitive health concerns.

"People do personify these chatbots. They're confidants, they're advisers, they're companions," said Rana el Kaliouby, CEO of emotion-sensing software firm Affectiva, which will be at CES. "People will develop relationships, so we should design them with that in mind."

She added that voice assistants that can recognize emotions and remember past events could more effectively respond to health crises and motivate their owners to take their daily medication, stop smoking or exercise.

The road to iConfidant

Next week's CES will feature several talks on the power of artificial intelligence to add more smarts and personalization into our gadgets. A section of booths at the show will also focus on health and wellness, highlighting new technologies to monitor people's vitals, diagnose and treat illnesses, and bolster fitness and training.

Following the March study, Apple said it worked with the researchers to improve Siri's responses to health needs. Google and Microsoft also added health knowledge to their voice assistants.


Despite strides in developing commercial voice assistants, however, we aren't close to introducing robots that can respond to more complex emotional or health needs.

"My hope is when these devices are out there, developers will make apps that can be used for these deeper purposes," North Carolina State's Jhala said.

Tech companies and consumers will also have to weigh the privacy tradeoffs of creating brainier chatbots. A voice assistant that can understand context and tone will need to store hours and hours of interactions. A device able to flag emotional states may need a camera.

Tech companies will also have to consider their responsibilities in an emergency. If the technology fails, could Apple or Google be culpable?

Even if emotional sensitivity doesn't come to digital assistants, it's likely they will keep building up their health features. Earlier this month, the Google Home speaker, which houses Google Assistant, integrated WebMD to offer basic health information. Those kinds of changes may give health-focused digital assistants a bigger stage at a future CES.

"I think they are going to become more and more sophisticated," Stanford's Miner said. "As that happens, users are going to have higher and higher expectations."


Source : https://www.cnet.com/news/ces-2017-siri-alexa-future-health-and-emotional-support-stanford/


IF YOU GOT an Amazon Echo or Google Home voice assistant, welcome to a life of luxurious convenience. You’ll be asking for the weather, the news, and your favorite songs without having to poke around on your phone. You’ll be turning off lights and requesting videos from bed. The world is yours.

But you know what? That little talking cylinder is always listening to you. And not just listening, but recording and saving many of the things you say. Should you freak out? Not if you’re comfortable with Google and Amazon logging your normal web activity, which they’ve done for years. Hell, many other sites have also done it for years. Echo and Home simply continue the trend of saving a crumb trail of queries, except with snippets of your voice.

However, it’s still a reasonable concern for anyone worried about privacy. If you only use Chrome in “Incognito Mode,” put tape over your laptop camera, and worry about snoops sniffing your packets, a web-connected microphone in your home seems risky. It’s a fair thing to be unsettled about. But recording your voice is a major part of how voice assistants work. Here’s how devices like Echo and Home record your voice, why they do it, what they do with the data, and how to scrub those recordings.

How In-Home Voice Assistants Work

Whenever you make a voice request, Google Home and Alexa-enabled devices record or stream audio clips of what you say. Those files are sent to a server—the real brains of the operation—to process the audio and formulate a response. The recorded clips are associated with your user account, and that process is enabled by default.

Because their brains are located miles away, Echo and Home need an internet connection to work. They do have a very rudimentary education, though: The only spoken commands they understand on their own are “wake words” or “activation phrases,” things like “Alexa” or “OK Google.” Once you say those magic words, the voice assistants jump to life, capture your voice request, and sling it to their disembodied cloud brains over Wi-Fi.

That means their mics are listening to you even when you’re not requesting things from Alexa or Google. But those ambient conversations—the things you say before “Alexa” or “OK Google”—aren’t stored or sent over a network.

Why Do They Need to Eavesdrop?

Listening to what you say before a wake word is essential to the entire concept of wake words. The process borrows a page from the pre-buffer on many cameras’ burst modes, which capture a few frames before you press the shutter button. This just does it with your voice.

With a camera, the pre-buffer ensures you don’t miss a shot due to a slow shutter finger. In the case of voice assistants, audio pre-recording helps systems handle requests instantly. Without a perked ear for that “Alexa,” “OK Google,” or “Siri,” these assistants would need activation buttons.
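That pre-recording behavior is essentially a small ring buffer of recent audio frames, discarded unless the wake word fires. A sketch, with the frame rate and detector as stand-ins rather than any vendor's real pipeline:

```python
# Sketch of a wake-word pre-buffer: keep only the last N audio frames in
# memory; nothing is sent anywhere unless the wake word is detected.
# Frame count and detector are stand-ins, not any vendor's real pipeline.
from collections import deque

PRE_ROLL_FRAMES = 50          # ~1 second of audio at 50 frames/sec (assumed)
pre_buffer = deque(maxlen=PRE_ROLL_FRAMES)

def on_audio_frame(frame, wake_word_detected):
    """Buffer each frame; on wake word, hand pre-roll audio upstream."""
    pre_buffer.append(frame)
    if wake_word_detected:
        clip = list(pre_buffer)   # frames from just before the trigger
        pre_buffer.clear()
        return clip               # would be streamed to the server here
    return None                   # older frames silently fall off the deque

# Simulate 60 frames of ambient audio, then a wake word on the last frame.
result = None
for i in range(60):
    result = on_audio_frame(f"frame-{i}", wake_word_detected=(i == 59))
```

The fixed-size deque is the whole privacy story in miniature: ambient audio exists only briefly on the device, and everything older than the pre-roll window is gone before any network transfer could happen.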

In fact, if you’re really freaked out by the concept of something always listening to you in your home, your best bet is a push-button voice assistant. Things like the Amazon Tap, the Alexa remote for Fire TV, or your phone with its “always listening” mode turned off.

Is This Secure? Can Hackers Tap In and Listen To Me?

Nothing is impossible, but Amazon and Google both have security measures that prevent snoops from wiretapping your home. The audio zipping from your home to Amazon and Google’s data centers is encrypted, so even if your home network is compromised, it’s unlikely that the gadgets can be used as listening devices. A bigger risk is someone getting hold of your Amazon or Google password and seeing a log of your interactions online.

There are also simple measures you can take to prevent Echo and Home from listening to you when you don’t want them to. Each device has a physical mute button, which cuts off the mic completely.

What About Siri?

Siri records your queries too, but she doesn’t catalog them or provide access to the running list of requests. You can’t listen to your history of Siri interactions in Apple’s app universe.

While Apple logs and stores Siri queries, they’re tied to a random string of numbers for each user instead of an Apple ID or email address. Apple deletes the association between those queries and those numerical codes after six months. Your Amazon and Google histories, on the other hand, stay there until you decide to delete them.
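Apple's scheme, as described, can be sketched as pseudonymous logging: queries keyed to a random token rather than an account, with the association dropped after six months. The storage layout here is invented for illustration:

```python
# Sketch of pseudonymous query logging: queries are keyed by a random
# device token, never an account ID, and entries older than six months
# lose their association. Storage layout invented for illustration.
import secrets
from datetime import datetime, timedelta

SIX_MONTHS = timedelta(days=183)

class QueryLog:
    def __init__(self):
        self.token = secrets.token_hex(16)   # random, not an Apple ID
        self.entries = []                    # (timestamp, query) pairs

    def record(self, query, when):
        self.entries.append((when, query))

    def expire(self, now):
        """Drop entries older than six months from the token's history."""
        self.entries = [(t, q) for t, q in self.entries
                        if now - t < SIX_MONTHS]

log = QueryLog()
log.record("weather tomorrow", datetime(2016, 1, 1))
log.record("translate shoehorn", datetime(2016, 12, 1))
log.expire(now=datetime(2016, 12, 15))
```

The design choice is the random token: even if the log leaked, nothing in it points back to an email address or account, and expiry caps how much history the token accumulates.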

Well, How About Cortana?

Microsoft’s Cortana voice assistant on Windows 10 works a bit differently, but it still mirrors some of your personal information on servers. To customize your experience, Cortana uses a combination of cloud-stored data and on-device data.

To see and manage the data stored on your machine, jump into the Start Menu, select Cortana, and click the rectangle with a circle inside it in the left sidebar. From there, you can view and delete entries in Cortana’s “Notebook,” a running list of all the information Microsoft’s voice helper knows about you. That Notebook and your cloud-stored info is synced, and you can check your Bing personalization hub to see what’s stored on Redmond’s servers.

Your Cortana-management controls are really granular, so it’s best to follow Microsoft’s deeper tutorial on how to manage your personal info.

What Happens To Your Recorded Audio Clips?

If you feel like revisiting past queries, you can listen to short clips of yourself asking to translate “shoehorn” into Italian or whether you should wear galoshes today. Alexa users can find a running list of their queries in the Alexa app in Settings > History. If a user has several Alexa devices in their arsenal, each one has its own listenable queue of requests.

Google users can find everything they’ve asked for by visiting myactivity.google.com while they’re logged into their account. This query museum doesn’t just include voice requests. It also includes any Google searches, YouTube videos, and apps you’ve launched on Android, among other things. It’s all presented in a neat, searchable chronological stack.

There are user benefits to these personal audio catalogs. For cases where spoken-word answers aren’t very useful—recipes and search results, for example—Amazon and Google provide links to written content in the Alexa and Home apps. Both companies say these audio databases help each system serve up personalized content and learn the intricacies of your Maine accent.

The semi-good news is that you have a bit of control over whether audio clips are saved or recorded at all. There are two big caveats, though. Disabling recording is only an option in Google Home, and when you do it, the device basically doesn’t work. And while you can delete your log of audio clips for both Alexa and Google Home, it’s unclear whether the data survives on servers after you delete it from the queue in your account.

How to Stop and Delete Voice Recordings in Google Home

There’s a hardware and a software way to silence Home’s microphone. The easy hardware method is to just tap the “Mute” button on the back of the device. Of course, the Assistant won’t record (or hear) your voice queries while mute is enabled. Just hit the mute button again to have Home start listening again—and recording and sending audio snippets again, too.

How about if you want Google Home to work without recording you, like an “Incognito Mode” for voice search? Unfortunately, there’s no way to do that right now. You can disable voice recording and audio logging at myactivity.google.com, but there’s a whale of a catch: Doing so makes Google Home a temporary paperweight.

At least it’s a paperweight that talks to you. Whenever you say “OK Google” and ask for anything, the speaker says “Actually, there are some basic settings that need your permission first.” And of course, it’s referring to turning voice-recording right back on again.

If you want to pause it anyway, visit myactivity.google.com, tap the three vertical dots on the top left of the page, select “Activity Controls,” and slide the “Voice & Audio Activity” slider to the left to pause it. Keep in mind that this doesn’t just disable Google Home, it also deactivates the Assistant on Android phones.

While you’re stuck with recording and logging audio snippets, you can delete saved audio clips after the fact. To do that, go to myactivity.google.com, tap the three vertical dots in the “My Activity” title bar, and select “Delete activity by” in the drop-down menu. Click the “All Products” drop-down menu, choose “Voice & Audio,” and click delete. Google says this may affect some of Google Home’s personalization features, because it essentially wipes its memory.
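If you'd like to audit what's stored before wiping it, Google Takeout can export your My Activity history as JSON. As a rough sketch (the `title` and `time` field names reflect what Takeout exports have used, but treat the exact format as an assumption), a few lines of Python can summarize how many entries an export contains per month:

```python
import json
from collections import Counter

def summarize_activity(path):
    """Count My Activity entries per month from a Takeout JSON export.

    Assumes the export is a JSON array of objects, each with a
    `title` (e.g. "Said OK Google ...") and an ISO-8601 `time` field.
    """
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)
    # "2016-11-03T12:00:00Z"[:7] -> "2016-11", i.e. group by month.
    per_month = Counter(e["time"][:7] for e in entries if "time" in e)
    return dict(per_month)
```

Running it over an export gives a quick sense of how much voice and search history has accumulated before you decide what to delete.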

How to Stop and Delete Voice Recordings in Alexa

Amazon’s Alexa app doesn’t let you stop recordings altogether, but just like Google Home, there’s a mute button on its Echo devices for temporary privacy. Amazon’s push-to-talk products, such as Tap and the Alexa-enabled remote for Fire TV, still record your requests. However, you have to manually engage a button to make Alexa listen, so you have direct control over when the devices are listening to you.

You can delete your long list of recordings in the Alexa app, but the app only lets you do that one entry at a time. To do a bulk-delete, you’ll need to visit amazon.com/myx, click the “Your Devices” tab, select your Alexa device, and click “Manage voice recordings.” A pop-up message should appear, and clicking “Delete” will wipe out your saved clips.

Source : https://www.wired.com/

Author : TIM MOYNIHAN

Categorized in Search Engine

If you've always wanted someone around the house to answer all the questions you have during the course of your day, that someone is a something -- Google Home.

Google has a winner with Google Home, for those who want a hands-free assistant that can answer a broad range of questions that come up during day-to-day activities in the house. It easily beats what Amazon Echo can do.

Google Home — which ships to consumers this week — is Google’s answer to the Amazon Echo, which came out two years ago. Both are hands-free, voice-activated devices designed to be placed in a home and able to play music, provide news, control devices and generally serve as an all-around assistant.

Assistants & asking for answers

Part of that assistance is answering questions people might have. Here, Google Home outshines Amazon Echo because its built-in “Google Assistant” is smarter than the Echo’s “Alexa” assistant. Because Google harvests information from across the web, it can answer far more types of questions than Alexa can, my testing found.

I’ve been using Google Home for nearly a week. It sits in my kitchen, next to my Amazon Echo. I’ve asked both questions, as have my family, as they’ve come up as part of our daily routine.

The videos below illustrate this. In some of them, you’ll hear me ask both devices the same question at almost the same time. If you do it right, you can get “side-by-side” answers. It does have the potential to confuse the device asked first. In any side-by-side examples I show, I also tested each device separately to ensure that they weren’t being confused.

For simple information, both do well

Ask both about the weather, and they are an equal match:


Basic facts are also provided by both, such as the distance from the earth to the moon:

When you go beyond the basics, however, Alexa can’t keep up. For example, Alexa couldn’t answer “Can guinea pigs eat grapes?” while Google Home gave a solid answer:

That question is a perfect example of how Google Home shines for off-the-beaten track questions. It came to mind because my guinea pig started squawking as I came into the kitchen, wanting a treat. I opened the refrigerator, saw some grapes and wondered if they’d be OK.

Actually, I already knew they were OK from previous experience. That experience was that I’d looked it up either on my phone or computer. Could one of these newfangled hands-free devices magically give me the answer, no typing required? Google Home could and did.


Hands-free answers, even for hard questions

As a long-time Amazon Echo owner, I’ve learned that its Alexa assistant generally can’t handle complicated questions. That’s trained me to not even ask. But with Google Home, each success gave me more and more confidence to ask further questions, a positive reinforcement loop and a real edge for the product.

A really intensive bout of questioning happened when my family was watching TV last week. A commercial came on for the Kia Soul EV, a small electric car. We’ve been looking for electric cars and hadn’t realized Kia made one. My wife wondered how many miles per gallon (or the electric equivalent) it got, and what its range was.

We paused the TV and asked our assistants, which could hear us from the living room:


I was pretty impressed. For my family, this was just a challenge to prove that Google Home could be stumped with more questions. My son, who’d been working on homework earlier, asked how to calculate “percent abundance,” which is some chemistry thing I’ve long forgotten and am thankful I don’t need to know now.

Google Home got it; Alexa did not. Here’s the side-by-side with me asking (and no, Alexa couldn’t get it when asked on its own):

My wife decided to try and stump Google Home by asking for British chef Delia Smith’s mince pie recipe. Google Home again had an answer where Alexa did not:


Companion app provides further information

Of course, completing a recipe delivered verbally is pretty much impossible for most cooking, unless you have a great memory. That’s where Google Home has another great feature. For complicated answers, it sends a link to the companion Google Home app on your phone, so you can consult the source site in more detail.

The companion apps for Google Home and Amazon Echo both keep a record of all your queries. The difference with Google Home is that you get these types of more information links presented.

For more about this, especially for SEOs and search marketers, see my other story: How Google Home turns voice answers into clickable links.

Welcome home, Google Home

In the end, Google Home won my wife over. “Yes, I’d buy that over the Amazon Echo,” she remarked. Meanwhile, my son gave up on trying to stump it and instead progressed to tricking Google Home & Amazon Echo into continually talking to each other:

My article How to make Google Home & Amazon Echo talk to each other in an infinite loop explains how you can do this yourself, if you have both devices and wish to further contribute to the woeful rise in the abuse of artificial intelligence and robots.

I’ve continued asking various questions that have come to mind on the spur-of-the-moment, and Google often comes through.

For example, I missed game six of the World Series while I was out. When I got home, my wife told me how there was a grand slam. But she wasn’t certain if that was the right term for when a run brings in loaded bases (she’s British). I thought it was but wasn’t certain (because I’m fairly sports ignorant).

I asked Google Home, one way, and it didn’t know. I tried a slightly different way, and I got an answer. The Amazon Echo couldn’t answer either way. Here’s the successful answer:

Google Home’s answers aren’t always right

Google Home isn’t perfect, of course. There are times that it just can’t answer a question. On the odd occasion, the Amazon Echo can answer when Google fails, as when I asked “What’s the World Series score?”, as shown below:

Asked another way, “Who’s winning the World Series?”, both were able to answer:

Perhaps a bigger issue is when Google Home confidently answers a question even though the answer isn’t right. For example, when my new Lego catalog arrived yesterday, there was an article about how the new Disney Castle is the second tallest Lego set of all time. I wondered what the tallest was and asked:


Google Home pulled an answer about the Taj Mahal from an IGN article, saying it was the biggest. That’s true. But it’s not the tallest. That’s the Eiffel Tower set of 2007.

That answer was bad because it didn’t address the actual question. Here’s an example where Google Home gives a flat-out incorrect answer, declaring that Barack Obama is “King of the United States.”

Ironically, that’s an answer from Search Engine Land, where we documented how Google was mistakenly giving the wrong answer for this question from another source. By doing this, we became the new source. It’s just one of many examples we’ve covered where Google’s guesses about answers drawn from across the web go wrong.

In short, Google Home’s strength in drawing answers from across the web, without human curation or review, can also be its weakness. But overall, I’d say as with regular Google itself, it’s more likely to get things right than wrong.

Beyond answers, Echo is stronger

Beyond answering questions, I’d give Echo the edge, an advantage that largely comes from being a platform that has matured over the past two years.

While Google Home can control devices, Echo seems able to handle a wider variety. In my home, Echo can talk with two different types of connected lights I have, as well as a non-Nest thermostat. Google Home couldn’t see any of these.

I love how Echo delivers up news from a huge variety of sources, over 300. Google has about 50.


Echo also really shines in having a deep library of “skills,” where third parties have enabled Echo to do certain things, like test your Harry Potter knowledge, play the “Name Game,” have you do random exercises and, yes, the always-amusing skill to make your Echo fart.
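Skills like these are built with the Alexa Skills Kit: Amazon sends your backend a JSON request describing what the user said, and your code returns a JSON reply containing the speech Alexa should say. As a minimal sketch (the handler, the `TellJokeIntent` name, and the joke text are hypothetical; the request/response envelope follows the ASK custom-skill JSON format), a skill backend can be as small as this:

```python
def lambda_handler(event, context):
    """Minimal Alexa custom-skill handler (sketch).

    Alexa POSTs a JSON request whose `request.type` is
    `LaunchRequest`, `IntentRequest`, etc.; the reply must include
    an `outputSpeech` object with the text Alexa should speak.
    """
    request_type = event.get("request", {}).get("type")

    if request_type == "IntentRequest":
        intent = event["request"]["intent"]["name"]
        if intent == "TellJokeIntent":  # hypothetical intent name
            text = "Why did the robot cross the road? It was programmed to."
        else:
            text = "Sorry, I don't know that one."
    else:  # LaunchRequest, SessionEndedRequest, ...
        text = "Welcome! Ask me for a joke."

    # The response envelope Alexa expects (ASK JSON format, v1.0).
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }
```

In practice you'd deploy this as an AWS Lambda function and register the intents in the Alexa developer console, but the core of a skill really is just mapping an intent name to a spoken reply.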

Activating Echo to ask questions is easier, in terms of syllables and words. Echo responds when you say the “hotword” or “wake word” of “Alexa,” a single word of three syllables. You can even change that to “Echo.” Google Home wants “OK Google,” two words and four syllables. I know it’s just an extra word and syllable, but it still adds a slightly annoying amount of delay when asking for what you want. It’s also not something you can change, but Google is also testing using “Hey Google,” which at least saves a syllable.

Personally, I feel the sound quality from Echo is better than Google Home. We often play music out of Echo because it so easy to do and the speakers are so nice. Google Home sounded flatter and less rich to me. But others might disagree, and this certainly isn’t a review of its sound quality.

Another nice touch with Google Home is that if you have two of them, you can have music play on both. You can also speak to one and tell it to send music to another. This worked fairly well in my testing, telling my upstairs Google Home to play something downstairs. You can also send to Chromecast units or devices with Chromecast built-in, though this failed to work with my Vizio TV that has Chromecast support.

Worthy competition

If you’ve been considering a home assistant, Google Home is a compelling choice. Even being brand new, it stands up well against Echo in many ways. It’s especially good if you like the idea of a device that lets you easily ask all types of questions.

Google Home is also $50 cheaper: $129 in the US versus $179 for the Amazon Echo. Amazon does offer the Amazon Tap for $129, which is basically a smaller version of the Echo with an internal battery, making it portable. The sound on that is great, but the disadvantage is that you have to push a button to ask questions. It’s not a robust replacement for a true hands-free assistant.

There’s also the Echo Dot, which is cheap at $50. It does the hands-free assistant stuff as well as the regular Echo. But it has a tiny speaker that’s not great for playing music, though it can be connected to a sound system.

Echo has the advantage in being a robust device that’s even smart enough to recognize that different people may use it, so that you can switch to access different music lists or shopping lists. It can even read books that you already own, no audiobook version needed (if you don’t mind a robot-sounding voice).

While Google Home isn’t as robust in some areas, there’s every reason to expect it will grow. That’s especially because it’s already launching with a solid foundation.

Source : searchengineland


Siri, Cortana and Alexa are virtual assistants with female personas — though Siri can be a man, too. Until today, Google voice search didn’t have an identity or persona, though it has a female voice.

That is changing with the official rollout of Google Home. For the launch of Home, Google took its voice search capabilities and added a persona. So instead of calling Google’s spoken results Google Now, OK Google or Google voice search, it/she will now be the “Google Assistant,” which is not quite a human-sounding name, but better and more descriptive than Google Now.

Like Amazon, Google will have devices (e.g., Home, Pixel phones) and products (e.g., Allo) that feature the Assistant, the way Amazon has the Echo and Echo Dot, powered by Alexa. All this was previewed at Google I/O this summer. You can interact with the Assistant in more limited form today in Google’s new messaging app, Allo.

This summer, it appeared that Google wasn’t going to use the name “Assistant” for its Google Home voice persona or as a consumer-facing product name. However, it appears the company changed its mind over the past several months. (The assistant will launch as female, but over time, it will offer more voices and potentially, personas.)

According to Ryan Germick, who led the Google Doodles team and helped develop the Assistant’s personality, Google Assistant should be thought of as a kind of friendly companion, “Always there but never in the way; her primary job is to be helpful.”

Like Siri, Cortana and Alexa, Google Assistant will tell jokes and have conversational features to “humanize” and make Google “more approachable.” One of the advantages that Google has with the Assistant over its rivals is its search index and knowledge graph. However, Germick said that there may be instances where Google Home will not provide a result, other than reading back a list of search results.

Germick explained that in creating the Assistant’s personality, Google utilized “storytellers” from Pixar and The Onion, among others, to craft scripted answers to a broad range of questions. Presumably, this is where the humor will show up. However, over time, there may also be “AI jokes” (We’ll see).

“Fun in, fun out,” Germick added. That means users will need to prompt the Assistant for jokes or snark, which won’t happen unsolicited. But that’s apparently happening quite a bit in Allo (e.g., “What is the meaning of life?”).

Germick called the Google Assistant a “beautiful marriage of technology and scripting.” The proof will be in the user experience — though what we saw demoed today was impressive to me — and undoubtedly, we’ll see numerous side-by-side comparisons of the Google Assistant with its competitors when Home formally comes out November 4. (Apple is also rumored to be working on a standalone Siri-powered smart home device.)

For now, we have the video released at I/O, showcasing the Google Home user experience.

Source : searchengineland

