When we think about flying cars, an idea that goes back more than 50 years, we often think of awkward technical concepts. Where does one stow the wings when driving? How does the designer efficiently handle the propulsion for both road and air? Airbus has come up with an ingenious solution, brilliant in fact: an autonomous drone comes and lifts the passenger module away. Digital Trends has the story and the demo video. It’s still just a concept, and a real product is 7 to 10 years away, according to Airbus. And then there’s the issue of FAA regulations, even if it’s autonomous in all modes. Could be very cool. And no more sitting idle in rush hour traffic.
Apple reignited interest in its autonomous car project with a letter to federal regulators arguing that “new entrants” into the autonomous vehicle industry should have just as many rights as the established automakers when it comes to testing prototypes on public roads.
We tend to think of robots and AI agents as potentially threatening. But when they’re specifically charged with protecting the human passengers in autonomous cars, there could be some serious shenanigans by aggressive human drivers. Even abuse. And what if one of those autonomous cars, in turn, does something unexpected? John looks at a mind-numbing scenario.
A major problem with Artificial Intelligence (AI) development is that a time might come when AIs are able to learn and teach themselves faster than humans can manage them. Recently, President Obama suggested that AIs that aren’t properly constrained and regulated could be unleashed on unsuspecting citizens and severely disadvantage them. Figuring out when to step in will be the great 21st century challenge for governments.
The time since most of the Macs have been updated can now be described as geologic. Is that because Apple doesn’t care about the Macs? Or, more likely, could we be in for another major architectural change? Evidence is mounting that Apple will abandon Intel and take the Mac lineup to ARM. John looks at the evidence and makes the case.
Much has been written now about the moral guidance for autonomous cars and trucks. It’s a difficult problem that involves quantifying, and then instantiating in software, the logic of life and death decisions. It would be nice for society to have more time to ponder, but the pace of technology leaves us precious little time for that. Machines are going to make moral decisions very soon. Shall we let them?
One of the most important issues with the autonomous cars of the future is the partitioning of liability. To that end, new legislation proposed in Germany would require a data recorder to log when the car is under autonomous or driver control, to aid in the assignment of responsibility after an accident. But such a box has privacy implications. And it might be hacked. Would such a data recorder deter buyers? Could Apple overcome all this?