Wednesday, January 9, 2019

Canada updates Drone regulations



So drones are all in the news lately because of incidents at Gatwick and Heathrow airports.

Today (09 January 2019), the Canadian government announced new regulations for drone operators.

https://www.cbc.ca/news/politics/drones-aviation-garneau-regulations-1.4970750


The federal government has adopted strict new regulations to govern the use of drones in Canadian airspace — prohibiting them from flying near airports and emergency scenes, and ensuring those operating them aren't drunk or high on drugs.

Speaking to reporters at a news conference in Montreal Wednesday, Transport Minister Marc Garneau said drones have the potential to improve lives but can also present security problems.
"The government is resolved to improve the security of aviation and of the public. At the same time we are also resolved to encourage and support the possibilities of innovation and economic growth that drones represent."

The new regulations are comprehensive. They require that drones be registered and that operators of larger drones be certified. The regulations also state who can operate them, where they can fly and what they can carry.


New regulations are probably a good thing, but before something goes terribly wrong there is going to have to be some serious effort towards enforcement. "Drone Force" perhaps - tongue firmly in cheek.
It seems likely that airports are going to need some sort of militarised protection, be it interference 'shields' or weapons.




Thursday, October 25, 2018

Administrative Law and AI Accountability


Different countries take different approaches to administrative law. Some have little specific legislation and rely on the courts to determine whether decision making was correct; Canada seems to work much like this. Australia, even though it is a common law country, has specific laws regarding decision making, a separate tribunal system in the Administrative Appeals Tribunal, and in many states permanent independent commissions that watch over public bodies and examine evidence of corruption.

What then of Big Data and AI decision making? We know humans are biased, and we know we will build those biases into the technology we create - that is the whole point of regulation.

A few jurisdictions have now begun to venture into regulating AI decisions.

However, we know that is going to be very difficult. Watching the AlphaGo documentary was fascinating. In real time the AlphaGo team could ask questions of the AI, such as the probability of a human making the same move. But getting the AI to actually say why it made that move is more challenging. Humans have intuition - perhaps after playing thousands of games AlphaGo had intuition too.




From the Economist 2018-02-17

As The Economist discussed back in February, there are approaches to the problem, such as explainable AI. But that only gets you so far.

The real problem is that good AI may not be self explanatory.
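One way to see what "self explanatory" could mean: a simple linear scoring model can report exactly how much each input pushed its decision up or down, which is something a trained deep network cannot do directly. A minimal sketch in Python, where the feature names, weights, and baseline values are all invented for illustration (not from any real lending system):

```python
# Minimal sketch of one "explainable AI" idea: for a linear scoring
# model, each feature's contribution to a decision can be read off
# directly as weight * (value - baseline).  The weights, features,
# and baseline below are hypothetical, purely for illustration.

BASELINE = {"income": 50_000, "debt": 10_000, "years_employed": 5}
WEIGHTS = {"income": 0.00004, "debt": -0.00012, "years_employed": 0.15}

def explain(applicant):
    """Return each feature's contribution to the score vs. the baseline."""
    return {
        name: WEIGHTS[name] * (applicant[name] - BASELINE[name])
        for name in WEIGHTS
    }

def score(applicant):
    # The decision score is just the sum of the contributions.
    return sum(explain(applicant).values())

applicant = {"income": 80_000, "debt": 30_000, "years_employed": 2}
contributions = explain(applicant)
# Each entry says exactly how much one feature pushed the decision;
# a deep network offers no such direct reading - hence the black box.
```

Tools marketed as explainable AI essentially try to recover attributions like these for models where they cannot simply be read off the weights, which is why the approach "only gets you so far".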

Machine learning works by giving computers the ability to train themselves, which adapts their programming to the task at hand. People struggle to understand exactly how those self-written programs do what they do (see article). When algorithms are handling trivial tasks, such as playing chess or recommending a film to watch, this “black box” problem can be safely ignored. When they are deciding who gets a loan, whether to grant parole or how to steer a car through a crowded city, it is potentially harmful. And when things go wrong—as, even with the best system, they inevitably will—then customers, regulators and the courts will want to know why.

For some people this is a reason to hold back AI. France’s digital-economy minister, Mounir Mahjoubi, has said that the government should not use any algorithm whose decisions cannot be explained. But that is an overreaction. Despite their futuristic sheen, the difficulties posed by clever computers are not unprecedented. Society already has plenty of experience dealing with problematic black boxes; the most common are called human beings. Adding new ones will pose a challenge, but not an insuperable one. In response to the flaws in humans, society has evolved a series of workable coping mechanisms, called laws, rules and regulations. With a little tinkering, many of these can be applied to machines as well.


The Economist may be overly optimistic - first, that governments or courts will even get around to looking at AI, and second, because the speed of change is overwhelming.

However, The Economist does have a point: humans sometimes, maybe even often, cannot explain their decisions. That is what makes New York City's legislation so interesting.

Quoted from the MIT Technology Review blog:


New York City has a new law on the books demanding “algorithmic accountability,” and AI researchers want to help make it work.

Background: At the end of 2017, the city’s council passed the country’s first bill to ban algorithmic discrimination in city government. It calls for a task force to study how city agencies use algorithms and create a report on how to make algorithms more easily understandable to the public.


Rubber, meet road: But how to actually implement the bill was left up for grabs. Enter AI Now, a research institution at NYU focused on the social impact of AI. The group recommends focusing on things like making sure agencies understand the technology better, and providing a chance for outside groups to look at algorithms.



https://www.technologyreview.com/the-download/610346/the-big-apple-gets-tough-on-biased-ai/

Link to the text of the New York City legislation.

A good article about the legislation's development is here at The New Yorker.

Thursday, March 29, 2018

Autonomous Vehicle Readiness Index - KPMG

Earlier in the year KPMG came out with a report on autonomous vehicle readiness. The recent tragic death of a pedestrian struck by an Uber AV just reinforces that technology is neither autonomous nor predetermined, and needs profound governance.

The pace of development of AVs is breathtaking. A year ago, some would have argued that they would never become a reality. But now, AVs are being piloted in a number of countries and are running on public roads, albeit only in a handful of locations such as Phoenix in the US State of Arizona and in Singapore. The question is no longer whether but when all road vehicles become fully autonomous. And whether you believe that will take 10 years or 30, the implications are so far-reaching that policymakers need to start planning now for our AV future.




https://home.kpmg.com/xx/en/home/insights/2018/01/2018-autonomous-vehicles-readiness-index.html


Monday, February 26, 2018

Delivery Robots

Quoted...

Well, this seems ironic. It may be home to some of the most innovative tech companies on Earth, but it appears that San Francisco has an aversion to robots. The San Francisco Chronicle reports that the city has just put draconian restrictions on multi-wheeled delivery bots—such as those made by Starship Technologies, pictured above—that are being tested on the city’s sidewalks to carry food and packages to customers.

San Francisco’s Board of Supervisors says that only nine such robots can operate across the city at any time, and companies can have no more than three robots each on the city’s streets. There’s more: the robots will be largely limited to streets in industrial areas, must not travel faster than three miles per hour, and must be under constant human supervision.

Now, a city swarming with experimental robots doesn’t sound like a great idea. Nobody wants armies of half-baked bots rolling unsupervised through densely packed streets causing accidents. But nine machines across an entire city? That seems rather measly, and forbidding companies from testing their robots in residential areas means they can’t gain valuable experience in their key target market.

(Robot makers can, however, take some small comfort from the fact that one supervisor, who wanted to ban the robots from San Francisco altogether, didn’t carry the day.)

https://www.technologyreview.com/the-download/609730/san-francisco-is-really-really-worried-about-robots/

Thursday, February 22, 2018

Drone regulation



If you're a drone operator in Australia, you should brush up on your safety regulations or face being automatically grounded from today. 

As of February 14, pilots of DJI drones — the most popular brand in the country — will be required to pass a short "knowledge quiz" about safety in order to fly their machines.

The quiz is embedded in a mobile phone app which controls the drone, meaning every DJI pilot will be forced to take the three-minute test, even if they are an experienced operator.

There is no limit on the number of times a pilot can attempt the quiz.

Peter Gibson, corporate communication manager at the Civil Aviation Safety Authority (CASA), which has authorised the move, says DJI approached the agency to help design the nine-item test.

http://www.abc.net.au/news/2018-02-14/australian-dji-drone-pilots-forced-to-take-quiz-to-fly/9443712?section=technology


Wednesday, February 21, 2018

Hacking Tractors (& Legislation)



The days of home tractor repair are coming to an end with machinery technology and tightening intellectual property restrictions meaning farmers are forced to pay big bucks to fix their machinery.
When Nebraska farmer Tom Schwarz bought a tractor he did not realise he would be bound to his John Deere dealer who holds onto intellectual property rights to fix it.


"When you paid the money for a tractor, you didn't actually buy the tractor … because all of the intellectual property is still theirs," Mr Schwarz told tech journalist Jason Koebler in a documentary released earlier this month. "You just buy the right to use it … for life."



Farmers and independent machinery repairers across the United States are now campaigning for the right to fix their own machinery.



.....


In Nebraska, a "fair repair" law is being proposed to allow farmers to repair their own tractor.
If successful, the Right to Repair Act would make it mandatory for companies to disclose their diagnostic software and sell parts.



http://www.abc.net.au/news/rural/2018-02-22/tractor-hacking-farmers-in-the-us-fight-for-right-to-repair/9470658

Wednesday, December 20, 2017

Uber not a technology company



Uber is a transport services company, the European court of justice (ECJ) has ruled, requiring it to accept stricter regulation and licensing within the EU as a taxi operator.
The decision in Luxembourg, after a challenge brought by taxi drivers in Barcelona, will apply across the whole of the EU, including the UK. It cannot be appealed against.
Uber had denied it was a transport company, arguing instead it was a computer services business with operations that should be subject to an EU directive governing e-commerce and prohibiting restrictions on the establishment of such organisations.