
You may one day be able to buy a Jaguar…steering wheel

Will any of us actually own cars in the future? If Uber and Lyft pave the way, we’ll keep using apps (or chatbots) to summon a car owned by someone else. And, if Jaguar Land Rover is right about the future, we won’t own the car but we might own the steering wheel itself.

In a concept called Jaguar Future-Type, the British car company has attempted to show what the future will look like. The steering wheel could sit in your kitchen or by your bed. You might use it to control your music and set navigation waypoints — a steering wheel that doubles as something like a Bluetooth speaker.

It looks like this:

When you want to drive, you’d “summon” the actual autonomous car, one that might be owned by a group of people or by a fleet service. You’d attach your own steering wheel if you wanted to drive yourself or to load your own settings. While the concept is a reasonable stab at imagining what driving will be like in 2040, there are a few flaws in the design.

First, a bit more about the interface itself, called Sayer. (It’s named after Malcolm Sayer, the Jaguar car designer who died in 1970 of a heart attack.) As with most concepts, the actual details are a little fuzzy, but the idea is that you can talk to the steering wheel much as you would to a voicebot like Siri or Amazon Alexa. You might ask Sayer which parts of the road are ideal for driving yourself (e.g., curvy, with nice vistas) and which parts are better left to a robotic driver (e.g., boring stretches, traffic, highways).

Here’s what I like. We will definitely talk to cars. We will summon them, and we probably won’t own them. Why would we? Transportation itself will change dramatically by 2040. We might use an Uber, or a personal car that drives itself, or maybe an autonomous bus. Who knows? Maybe the Hyperloop will replace most common forms of transport.

The part that doesn’t seem likely is owning only a part of the vehicle. We’re already living in the cloud, so our data and navigation preferences are not tied to a particular device…why would they be in 2040? I already use Apple CarPlay today, which is tied to my phone but doesn’t really need to be tied to anything. (My navigation preferences are all stored in an app.)

Also, my theory about voice interfaces is that they will become incredibly prevalent. We won’t talk to a phone or a speaker. We’ll talk to Alexa everywhere. Bots will be so ubiquitous that we will be able to talk to them in the office, the kitchen, and the living room. Someday, new homes will be constructed with Alexa (or some other bot) active in every room. We won’t even know or care which piece of hardware the bot physically runs on.

It’s also a bit ridiculous to think a steering wheel would be the device we carry around. It looks heavy and bulky. Instead, biometrics in my office will identify my voice. When I summon an autonomous car, it will adjust itself to my preferences. I won’t talk about navigation. An AI will already know more about my day and my schedule than I can remember myself. It will take me to my destination automatically, based on machine learning, not on what I told a steering wheel in my living room. If an AI determines that I’d like to drive on a curvy road, it won’t need to ask me about it. It will offer to let me drive, knowing my preferences.

All of this is to say: bots will live in the cloud. We’ll activate them without thinking about where they are located or which speaker is in the office. They will know our preferences, so it won’t be like the voicebot interactions we have today, which are so scripted and basic. If anything, Sayer is what I hope will not be viable in 2040, despite how much I like the car design.


About Ms. A. C. Kennedy

My name is Ms. A. C. Kennedy, and I am a health practitioner and consultant by day and a serial blogger by night. I love family, life, and learning new things — especially learning how to improve my business. I also love helping others and sharing my information with them. Don't forget to ask me anything!

Check Also

Microsoft’s Seeing AI app for the blind now reads handwriting

Artificial intelligence took center stage at Microsoft's AI Summit in San Francisco on Wednesday. Aside from announcing AI smarts for a range of software -- from Bing to Office 365 -- the tech titan is also ramping up its Seeing AI app for iOS, which uses computer vision to audibly describe the world for blind and visually impaired people. According to Microsoft, the app has nabbed 100,000 downloads since its US launch earlier this year, which convinced the company to bring it to 35 countries in total, including the EU.

It's also getting a bunch of new features. The app now boasts broader currency recognition, adding British pounds, US dollars, Canadian dollars, and euros to its tally. Going beyond the overall color of a scene, it can also spot the color of specific objects, like clothes. Plus, it's no longer restricted to short printed text, with handwriting recognition now part of its skill set. You can also customize the voice it uses to speak its observations aloud, and set how fast it talks.

Finally, a musical light detector signals the light level in an environment with an audible tone -- Microsoft claims the tool will save users from having to touch a hot bulb or LED to check whether it's on. Despite the big update, there's still no word on an Android launch.

Source: Microsoft