Cars that tell people where to go

A three-year Macquarie University research project has studied the differences in how humans and computers talk - by analysing how each gives driving directions.

Professor Robert Dale, Director of the Centre for Language Technology at Macquarie University, and his team first studied the language used by existing in-car navigation systems, their handheld counterparts and websites like whereis.com.

"We very quickly realised that the way these devices give you instructions is nothing like what people do," he says. "What you get from the computer is 'go straight 300 metres, turn left, go straight 500 metres, turn right'. If you ask a person the same route they'll say something like 'well, you go along the road to just past the second traffic lights, then take a left, and then you go over a couple of speed bumps and you're there'."

Through a series of experiments, Dale and his team identified three key differences between computer and human directions, and built a system that produces much more human-like descriptions.

"Firstly, people tend to use landmarks rather than distances in their directions," he says. "Very rarely will people say 'go straight for 300 metres'; they're much more likely to say 'go along the road until you see the flagpole' or 'until the white gate'. Of course, these machine systems so far don't have very many landmarks built into them. That's changing, particularly in the US, where you find companies like McDonald's are funding the addition of the locations of their stores to these datasets, so that you can now get a route description that says 'follow Highway 1 until you pass the McDonald's'."

The second major difference is that people don't speak in the short, staccato sentences used by computer systems. Dale and his team analysed how humans described things and how they followed routes in order to establish how best to package directions into more human-friendly 'chunks'.

"The third thing that's different about what people do is they miss things out, because they're obvious," says Dale. "So we also did some work on how we could get the machine to more intelligently decide what information to drop from descriptions."
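The three differences Dale describes can be illustrated with a toy sketch. This is not the researchers' actual system; the data format, function names and the "obvious step" threshold are all invented here for illustration. The sketch prefers landmark references to raw distances, packages steps into sentence-sized chunks rather than one staccato instruction per step, and silently drops short, landmark-free straight stretches as "obvious".

```python
def phrase(seg):
    """Turn one route segment into a human-style instruction.

    Difference 1: prefer a landmark reference over a raw distance.
    """
    if seg["action"] == "continue":
        if seg.get("landmark"):
            return f"go along the road until you see {seg['landmark']}"
        return f"go straight for about {seg['distance_m']} metres"
    # A turn: anchor it at a landmark when one is available.
    if seg.get("landmark"):
        return f"take a {seg['action']} at {seg['landmark']}"
    return f"take a {seg['action']}"


def describe(route, chunk_size=3):
    steps = []
    for seg in route:
        # Difference 3: omit 'obvious' steps -- here, short straight
        # stretches with no landmark to anchor them (threshold is arbitrary).
        if (seg["action"] == "continue" and not seg.get("landmark")
                and seg.get("distance_m", 0) < 200):
            continue
        steps.append(phrase(seg))
    # Difference 2: package the steps into sentence-sized chunks
    # instead of one terse instruction per step.
    sentences = []
    for i in range(0, len(steps), chunk_size):
        chunk = ", then ".join(steps[i:i + chunk_size])
        sentences.append(chunk[0].upper() + chunk[1:] + ".")
    return " ".join(sentences)


route = [
    {"action": "continue", "distance_m": 150},                   # dropped as obvious
    {"action": "left", "landmark": "the second traffic lights"},
    {"action": "continue", "landmark": "the white gate"},
    {"action": "right"},
]
print(describe(route))
```

Running this produces a single flowing sentence built around landmarks, with the short opening stretch omitted entirely, in contrast to the "go straight 150 metres, turn left…" style of a conventional device.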

Although they now have a working web-based system which sounds much more like a human giving directions, the researchers have one unanswered question.

"The one thing we don't have a clear answer to is whether the more naturalistic descriptions are necessarily better than the very tabular, boring, machine descriptions," says Dale. "We have a real concern that people will already be so used to the 'turn left, turn right, turn left, turn right' system that they won't accept anything other than that."

The opinions expressed here are the views of the writer and do not necessarily reflect the views and opinions of News-Medical.Net.