From Literature to Living Rooms: Perceptions of Robots in Society

As drones have become increasingly accessible, media outlets have been preoccupied with news stories that fuel our fears of privacy invasion and physical harm. Although drones have only recently become mainstream, society has long been fixated on the need to regulate robots in order to protect itself from harm. This dystopian view of robots originates in Golem literature and the Romantics. In 16th-century Jewish literature, Rabbi Loew of Prague created the Golem, a creature constructed from clay to protect the community from being expelled by the Holy Roman Emperor. Rabbi Loew would deactivate the Golem on Friday evenings in preparation for the Sabbath. One Friday, the Rabbi forgot to deactivate the Golem, and it became a violent monster that had to be destroyed. A similar theme emerged in Mary Shelley’s Frankenstein, in which a man-made monster turned against its creator.

The blueprints outlined in Golem literature and the Romantics were further refined in the realm of science fiction. Writing just prior to the advent of the modern robotics industry, Asimov advanced three laws to negotiate the dangers associated with the introduction of robots into society. Asimov’s Three Laws of Robotics provide that:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm;
2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law; and
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Asimov later added a Zeroth Law that would supersede the Three Laws: a robot may not harm humanity, or by inaction, allow humanity to come to harm.

I posed the following question to Tony Dyson, designer of R2-D2, the brave and lovable droid whom many perceive as the true hero of Star Wars: As robots become increasingly autonomous, do you think we will need Asimov’s laws? Here is what Dyson had to say:

I would love to say yes, all intelligent machines (autonomous robots) that are programmed to think for themselves, must also have an overriding ‘hard wired’ set of rules to work with. These should not be guidelines, but must be a set of laws, clearly defined by the ruling body. However the practical problem is, as Rodney Brooks, co-founder of iRobot has alluded to: ‘People ask me about whether our robots follow Asimov’s laws. There is a simple reason [they don’t]: I can’t build Asimov’s laws in them.’

So we ask the question, do we face any danger from robots without Asimov’s laws? I don’t see our AI research progressing into ‘Skynet Terminator’ anytime soon, but I may be just saying that, as part of my evil plan – there is a good reason why I share the same name as the ‘Head Robotic Scientist’ in the film Terminator.

Why do we fear robots? The term robot comes from the Czech word robota, which means forced labour. Simply put, we create robots to serve and fulfill our needs. However, advances in artificial intelligence are bringing us closer to achieving autonomous robotics. If and when robots become truly autonomous, we fear that they will no longer serve us – or worse, that they will turn against us and destroy us. The consequence of our fear of robots is that we will systematically resist technological advances that may prove beneficial. The debate is yet to be settled on whether robot surgeons will err less frequently than their human counterparts, or whether driverless cars will decrease the number of accidents on our roads. The point is that if we resist these advances, such questions will remain unanswered.

How can we move forward and change our perceptions about robots? In Japan, robots are highly integrated into society and this may have something to do with the different cultural outlook on human-robot interaction. For instance, in 2007, Japan’s Ministry of Foreign Affairs designated Astro Boy as the nation’s envoy for safe overseas travel. In North America, Hollywood could play an important role in shaping positive attitudes towards consumer drones and robots.

Earlier this year, Clive Thompson published an article in the Smithsonian titled “Why Do We Love R2-D2 and Not C-3PO?” Thompson explored how the design of robots impacts our reaction to them, arguing that: “R2-D2 changed the mold. Roboticists now understand it’s far more successful to make their contraptions look industrial—with just a touch of humanity. The room-cleaning Roomba looks like a big flat hockey puck, but its movements and beeps seem so ‘smart’ that people who own them give them names.” And it appears that Hollywood does in fact inspire robot makers… iRobot co-founder Helen Greiner recently posted a note on Dyson’s LinkedIn profile, stating: “Because of Tony’s compelling emotive design, I fell in love with R2D2 when I was 11. This enabled my whole career in robotics from attending MIT to cofounding iRobot, the company that makes the Roomba vacuuming robot. I hope you see a little of R2D2 in your Roomba!”

Transport Canada Releases New Framework for UAV Operations

Earlier this month, Transport Canada announced that commercial operators will soon be able to benefit from two exemptions from the general Special Flight Operations Certificate (SFOC) requirement. Today, Transport Canada published an infographic on its site detailing the new exemptions. Here’s a breakdown of the new framework…

When do you need a SFOC?
An SFOC is required if you are operating a UAV that weighs more than 35kg, for any purpose. Operators must also obtain an SFOC if they are flying for “work or research” purposes and do not meet the requirements of either of the new exemptions (for instance, if the UAV weighs more than 25kg). Before outlining the exemptions, it is important to highlight that the “research” criterion reflects an expansion of the general SFOC requirement.

When can you avoid a SFOC?
If a UAV is not being used for “work or research” and it weighs 35kg or less, an SFOC is not required; however, operators are still expected to engage in safe practices. Transport Canada has enumerated “safety tips” such as: flying during daylight and within sight; avoiding airports, populated areas and moving vehicles; and not exceeding an altitude of 90 metres.

Those operating UAVs for “work or research” may be able to benefit from two exemptions. The first of these applies to operators flying UAVs weighing less than 2kg. The requirements that the operator must satisfy to qualify for this exemption include: age restrictions, carrying liability insurance, flying during daylight in direct line of sight, and flying at a distance of at least 30 metres from people, animals, buildings and vehicles not involved in the operation.

The second exemption applies to operators flying UAVs weighing between 2kg and 25kg for “work or research” purposes. This exemption features a more stringent spin on the requirements that apply to UAVs under 2kg (e.g. staying at least 150 metres away from people, animals, buildings and vehicles not involved in the operation). Additional criteria include developing and adhering to landing and recovery procedures and having a fire extinguisher on site.
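To make the framework easier to follow, the weight and purpose thresholds described above can be summarized as a decision sketch. This is an illustration only, not legal advice: the function name is my own, and the single `meets_exemption_conditions` flag stands in for the detailed criteria (insurance, age, daylight, line of sight, minimum distances, and so on).

```python
def sfoc_required(weight_kg: float, work_or_research: bool,
                  meets_exemption_conditions: bool) -> bool:
    """Rough sketch of Transport Canada's SFOC framework as described above.

    meets_exemption_conditions stands in for the detailed requirements of
    the applicable exemption (insurance, daylight, line of sight, distances).
    """
    if weight_kg > 35:
        return True      # an SFOC is required for any purpose above 35 kg
    if not work_or_research:
        return False     # recreational flight: safety tips apply, no SFOC
    if weight_kg < 2 and meets_exemption_conditions:
        return False     # first exemption (under 2 kg)
    if weight_kg <= 25 and meets_exemption_conditions:
        return False     # second exemption (2-25 kg, stricter conditions)
    return True          # e.g. 25-35 kg work/research flights still need an SFOC

print(sfoc_required(1.5, True, True))   # small commercial UAV meeting conditions -> False
print(sfoc_required(30, True, True))    # 25-35 kg work flight -> True
```

The sketch makes one structural point visible: weight alone never exempts a “work or research” flight; the operator must also satisfy every condition of the applicable exemption.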

Reflecting on the new exemptions, Brendan Schulman, a New York attorney representing commercial operators, commented that “Canada’s new regulatory framework reflects a thoughtful risk-based approach and recognizes that at low weights and low altitudes, commercial drones do not pose serious safety risks. I hope our regulators in the United States take note of this alternative path to the future regulation of commercial drones.”

Transport Canada’s Exemption Notification Simultaneously Signals a Relaxation of Commercial Drone Regulations and an Increased Focus on Academic Use

Last week, Transport Canada announced upcoming changes to the regulatory framework surrounding drones. As of the end of this month, commercial operators could benefit from two exemptions from the general Special Flight Operations Certificate (SFOC) requirement. First, an SFOC will no longer be required for commercial operators flying UAVs under 2 kilograms. And second, commercial operators conducting low risk operations using UAVs under 25kg will also be exempt from the SFOC requirement. Transport Canada has yet to release information regarding the factors that will determine whether an operation falls into the parameters of the second exemption.

A closer look at the Transport Canada announcement suggests additional implications are on the horizon. Here is the relevant statement:

“Until the new requirements come into effect, you must apply for a Special Flight Operations Certificate if:
1. Your aircraft weighs more than 35 kilograms (77 pounds).
2. You use your aircraft for work or academic purposes (such as aerial photography, geomatic surveying, crop observation, advertising, research and development).”

The academic purposes concept seems to be a new addition to the SFOC requirement. Let’s trace the evolution of the terminology employed in determining whether an SFOC is required…

Transport Canada enforces the Canadian Aviation Regulations (CARs) and the Aeronautics Act. Subsection 101.01(1) of the CARs defines a UAV as a “power-driven aircraft, other than a model aircraft, that is designed to fly without a human operator on board.” Section 603.66 prohibits flying a UAV without complying with SFOC requirements.

Under the CARs, a model aircraft is an aircraft weighing no more than 35kg that is mechanically driven or launched for recreational purposes. The CARs do not define “recreational purposes”. However, Transport Canada’s Staff Instruction No. 623-001 points to the definition in the Aeronautics Act, which turns on whether the aircraft is operated for “hire or reward”, defined as “any payment, consideration, gratuity or benefit, directly or indirectly charged, demanded, received or collected by any person for the use of an aircraft”.

A few weeks ago, Transport Canada published an infographic on its website to assist UAV operators in determining whether they need to apply for an SFOC. The infographic states that if you are not using the UAV for “work” and it does not weigh more than 35kg, you do not need an SFOC – still no mention of academic purposes. It appears that the notice regarding the two upcoming exemptions might be the first instance of the academic purposes terminology.

What are the implications of the new terminology? Without a definition of “academic purposes”, it is unclear how far-reaching its effects will be. Is the term meant to capture research and development activities taking place in academic institutions? Will it apply to student projects outside of research lab environments? Transport Canada will need to explain what is meant by “academic purposes” to ensure that UAV operators have clarity regarding the application of the SFOC requirements in academia.

Commercial Drone Regulations – Canada vs. US

When Canadians attempt to characterize aspects of Canadian culture, it’s not uncommon to draw comparisons with the US. I recently noticed that as I respond to questions about the Canadian regulations surrounding commercial drones, I often begin by stating that our regulatory framework is quite distinct from that of the US – here’s why…

In Canada, commercial operators can apply to obtain Special Flight Operations Certificates (SFOCs) from Transport Canada. It takes Transport Canada about 20 days to assess applications, and last year the agency issued 945 SFOCs to applicants representing a variety of industries including aerial videography, agriculture and oil and gas.

Generally, the Canadian regulations do not establish bright-line rules governing drone operations – for instance, they do not specify whether you need a pilot’s licence to complete a commercial drone flight, or whether flying beyond the visual line of sight is permitted. Rather, Transport Canada assesses applications on a case-by-case basis. In order to obtain approval, applicants must show that they can mitigate operational risks to an acceptable level.

In the US, the Federal Aviation Administration (FAA) has been working to develop drone regulations since the enactment of the FAA Modernization and Reform Act of 2012. Until that framework is in place, those looking to fly for commercial purposes can only proceed by exemption. Most companies have been denied exemptions, the notable exceptions being a couple of oil companies that received approval to operate drones in remote areas of Alaska.

Last Thursday, the FAA extended regulatory exemptions to six Hollywood companies looking to film using drones. Although the Hollywood exemptions represent a move in a positive direction, the restrictions placed on the companies are quite onerous; for instance, the operations must take place in a controlled, closed-set environment and may only be conducted below 400 feet and within the visual line of sight.

By comparison, commercial drone operations are the norm in Canada and will remain the exception in the US until the new rules are in place.

Legal and Ethical Issues Associated with Sensor and Drone Journalism

On March 18th, the Columbia Journalism School hosted a group of academics, lawyers, journalists and makers who gathered for a workshop on the legal and ethical issues associated with sensor journalism. The event was organized by Fergus Pitt, a Fellow at the Tow Center for Digital Journalism working on the Sensor Newsroom Project funded by the Tow Foundation and the Knight Foundation.

Workshop participants covered a wide range of topics including privacy, data accuracy and intellectual property. I participated in two panels: the first featured a discussion on regulatory and intellectual property issues with Mike Hord, Electrical Engineer at SparkFun Electronics and Matthew Schroyer, Founder and President of the Professional Society of Drone Journalists; and the second featured a discussion with Deirdre Sullivan, Senior Counsel at the New York Times on risks and liabilities associated with drones.

Mike Hord led an interesting discussion on the Federal Communications Commission (FCC) rules governing the electromagnetic spectrum. While commercial entities face stringent testing requirements for electronic devices, the good news for hobbyists is that the rules permit individuals to use a single design to build up to five electronic devices without having to complete any testing. However, even though testing may not be required in these cases, individuals must comply with all applicable rules. For instance, if a device causes unacceptable interference, the user may still face legal penalties.         

Matthew Schroyer explored closed and open source models in the context of sensor journalism. Media companies that develop closed technologies can benefit from clear revenue streams from licensing activities. Although newsroom technologies remain predominantly closed, journalists are increasingly adopting open source tools. The open source model presents many advantages to journalists, for instance it promotes transparency and accountability, which are particularly important in the context of sensor journalism investigations in which accuracy and precision are critical.

Deirdre Sullivan and I explored the risks and liabilities that media companies and journalists face when developing and operating drones, an obvious concern being the risk of physical injury or substantial property damage.

Deirdre approached these concerns from a negligence perspective. Tort liability for negligence arises where an individual owes a duty of care, the duty is breached and injury results. A journalist operating a drone has a duty not to expose others to foreseeable risks. If the journalist breaches this duty – for example, by flying dangerously close to a crowd at an outdoor concert – and someone is injured, then a negligence claim would likely succeed. Deirdre also explored the potential application of negligence per se in the context of commercial use of drones. Generally, when an action violates a statute (e.g. speeding), that action conclusively establishes negligence, hence the term negligence per se. Since commercial drone operations currently fall into a legal grey area, Deirdre suggested that it is unclear whether negligence would be presumed in personal injury claims arising from commercial drone operations.

I explored the application of product liability concepts to open and closed drones, and suggested that liability is more straightforward in the context of closed drones. For example, a closed drone may be built with safety features such as ‘sense and avoid’ technology to reduce the risk of collision. If these features do not function, then the developer may be held liable for personal injury or property damage. However, if a journalist operator modifies a drone in violation of the end-user license, then the developer could avoid liability by claiming alteration as a defense, and the operator is likely to be on the hook for personal injury or property damage that occurs.  

In the case of open drones, liability is more problematic. Assume a journalist operator modifies a ‘sense and avoid’ radar and adds communication and weather modes. If the revamped drone crashed into a person, causing bodily injury, who would be liable? A court would have to engage in a complicated analysis to determine whether the underlying technology or the modified upgrade is to blame. And, the initial developer of the open ‘sense and avoid’ radar would not be able to avoid liability by simply claiming alteration as a defense.

Although open technologies may be more problematic than closed designs from a liability perspective, industry measures may be adopted to mitigate liability risks. Developers of open technologies can look to licensing as a mechanism to allocate liability and promote non-harmful and ethical use of their technologies. For example, a sufficiently and selectively open license may be used to prohibit end-users from removing safety or privacy features incorporated by upstream developers.

For those interested in further reading, the workshop papers will be published in June by Columbia University.

Privacy Considerations in Setting up Tweed’s Medical Marijuana Distribution Business

 

Medical marijuana growing at Tweed’s Smiths Falls location

After April 1st, Tweed Inc. will be among the first businesses to sell medical marijuana in Canada. The new legislative framework that comes into effect on that date allows businesses that have received licences from Health Canada to grow and sell medical marijuana. Tweed has started production activities in its Smiths Falls facility, which was previously home to a Hershey chocolate factory. Over the last couple of months, I had the opportunity to work with Tweed to develop its privacy policy and practices to ensure compliance with the Marihuana for Medical Purposes Regulations (MMPR) and applicable privacy legislation. The following is a summary of some of the privacy considerations we looked at in establishing Tweed’s medical marijuana distribution business.

The Application Process

The MMPR require applicants registering to become clients of licensed medical marijuana producers to provide certain personal information, including their name, date of birth and gender. The MMPR also require information about the residences of applicants. For example, if an applicant does not live in a private residence, the applicant must disclose the type of residence that he or she lives in (e.g. a shelter).

Because an individual is only permitted to use medical marijuana if he or she has a “Medical Document”, a producer seeking to sell medical marijuana must be able to contact the applicant’s health care practitioner to verify the applicant’s prescription. Before this can be done, the applicant must complete a consent form granting the distributor permission to contact the applicant’s health care practitioner to inquire about the prescription.
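The applicant information described above could be modeled as a simple record. This is purely illustrative: the field names are my own, and the MMPR itself governs exactly what a licensed producer must collect.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ClientApplication:
    """Illustrative record of the applicant data described above.

    Field names are hypothetical; the MMPR defines the actual requirements.
    """
    name: str
    date_of_birth: date
    gender: str
    residence_type: Optional[str]       # e.g. "shelter" if not a private residence
    practitioner_contact_consent: bool  # signed consent to verify the Medical Document

    def can_verify_prescription(self) -> bool:
        # The producer may contact the applicant's health care practitioner
        # only once the applicant has completed the consent form.
        return self.practitioner_contact_consent
```

Modeling the consent as an explicit field mirrors the regulatory sequence: verification of the prescription cannot begin until the consent form is on file.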

Purchasing Medical Marijuana

Once applicants become registered clients, they can purchase medical marijuana from their distributors. Distributors are required to maintain records pertaining to purchases in order to comply with regulatory requirements. In certain circumstances, the MMPR require licensed distributors to disclose information about their clients to the police. In the interest of transparency, Tweed’s privacy policy outlines the legal obligations regarding such disclosure and the steps that Tweed will take prior to responding to such law enforcement requests. For example, before Tweed will disclose information about a client, the police officer making the request must provide Tweed with the full name, date of birth and gender of the individual being investigated.

The Delivery Stage

The delivery stage is very important from a privacy perspective. Health Canada itself learned this lesson last November when it sent notices to 40,000 individuals using medical marijuana in envelopes showing the patients’ names and referencing the Marihuana Medical Access Program. As expected, the disclosure of such personal information resulted in a class action lawsuit against Health Canada.

In order to maintain the privacy of its clients, Tweed will be using a secure delivery service. The external packaging of the deliveries will not contain Tweed’s name, its famous address (1 Hershey Drive), or information disclosing the medical marijuana contents of the package.

Transparency and Accountability      

As far as personal information goes, health information ranks among the most sensitive in nature, as it reveals the most intimate details of individuals’ personal lives. Accordingly, it is particularly important for businesses handling such information to operate in a transparent and accountable manner. More information about Tweed’s privacy practices and the contact information of Tweed’s Chief Privacy Officer can be found on Tweed’s website.

*This post was written with permission from Tweed.

Canada’s Anti-Spam Legislation: What businesses need to know

Before Canada’s new Anti-Spam Legislation (CASL) comes into force, businesses operating in Canada will need to review and modify their practices to ensure compliance with the new requirements regarding commercial electronic messages and the installation of computer programs. CASL will come into force in three stages over the next few years – the following is a brief summary of the main provisions of each stage.

Stage 1 (July 1, 2014): Commercial Electronic Messages (CEM) Provisions

Subject to prescribed exceptions, CASL prohibits sending a CEM unless the recipient has consented to receiving it and the CEM meets the prescribed requirements. There are certain situations in which consent may be implied. For instance, consent is implied where there is an “existing business relationship” as defined in CASL and its accompanying regulations. An example of a qualifying “existing business relationship” is one in which there has been a purchase or lease of a product or a service in the two years preceding the sending of the CEM.

If an existing business relationship does not meet any of the conditions for implied consent, the business must seek express consent from intended recipients. A valid express consent must also meet certain prescribed requirements. For example, a business seeking consent must clearly convey that it is seeking consent to send CEM, and intended recipients must take an active step to indicate their consent to receiving such CEM. This means that standard business practices such as using opt-out mechanisms or implementing a pre-checked consent box will no longer be acceptable.

CASL also specifies certain requirements regarding the form and content of CEM. Each CEM must: identify the sender; disclose the sender’s contact information (as prescribed); and provide a mechanism allowing the recipient to unsubscribe. The unsubscribe mechanism must allow the recipient to indicate, at no cost, the withdrawal of their consent, and the sender’s contact information must remain valid for at least 60 days after the CEM is sent. A request to unsubscribe must be given effect within 10 business days.
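As a rough illustration, the consent and form-and-content requirements for CEM could be modeled as a pre-send checklist. The field and function names are invented for this sketch; CASL and its regulations contain considerably more detail than a few booleans can capture.

```python
from dataclasses import dataclass

@dataclass
class CEM:
    """Simplified model of a commercial electronic message under CASL."""
    has_consent: bool            # express, or implied (e.g. existing business relationship)
    identifies_sender: bool      # sender is clearly identified in the message
    sender_contact_info: str     # must remain valid for at least 60 days after sending
    has_unsubscribe: bool        # working unsubscribe mechanism included
    unsubscribe_is_free: bool    # recipient bears no cost to withdraw consent

def may_send(msg: CEM) -> bool:
    """Sketch of CASL's CEM checklist as summarized above."""
    return (msg.has_consent
            and msg.identifies_sender
            and bool(msg.sender_contact_info)
            and msg.has_unsubscribe
            and msg.unsubscribe_is_free)
```

Note that consent is a gating condition: a message that is perfectly formed but sent without express or implied consent still contravenes CASL.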

Stage 2 (January 15, 2015): Provisions Related to Installation of Computer Programs

CASL prohibits a business from installing certain categories of computer programs on computers belonging to other people unless the business has obtained express consent from the persons on whose computers the programs are being installed. Additionally, businesses seeking to install computer programs must comply with certain requirements regarding removal. For instance, businesses must provide the recipients of computer programs with an email address to which the recipients may send a request to remove or disable the programs. The email address must remain valid for one year after the programs are installed. In cases where consent was obtained based on an inaccurate description of the applicable computer program, the business that installed it must assist in removing or disabling the program.

Stage 3 (July 1, 2017): Private Right of Action

CASL creates a private right of action that enables individuals to seek compensation from individuals and businesses that contravene the provisions. Individuals will be able to seek compensation for actual losses, damages and expenses incurred due to contraventions. It is expected that once these provisions are in force, class actions will soon follow.

Next Steps

As CASL’s three stages come into effect, businesses operating in Canada that are sending commercial electronic messages or installing computer programs should seek legal advice to ensure compliance. This summary is intended to highlight CASL’s key provisions, and in light of the nuances of CASL and its accompanying regulations, it is recommended that businesses obtain legal advice regarding compliance.
