The Unspoken Contract You Have With Your Data

The data you acquire—no matter its source—generally comes from someone who was willing to share it with you. You have a responsibility to do the right thing with it, and could run into trouble if you don’t. Here's why data ethics is becoming harder to ignore.

When someone joins your organization or does business with you, he or she expects certain things from that transaction. You create a social contract of sorts, a set of parameters within which you're allowed to act.

And when you stretch beyond those parameters, you threaten that entire relationship.

Such is the tale of a controversy that iRobot stirred up last week. In comments to Reuters, Colin Angle, the CEO of the robotic vacuum maker, hinted that its cloud-based mapping features could be sold to other companies.

“There’s an entire ecosystem of things and services that the smart home can deliver once you have a rich map of the home that the user has allowed to be shared,” Angle told the news outlet.

It proved a bad look for iRobot, and the company drew plenty of criticism over it. Reuters has since corrected the story to emphasize that the company wasn't actually looking to sell your data to random companies, but that it might share data with consumers' consent. Gizmodo, among other outlets, was skeptical of this, noting that a glance at iRobot's terms of service and privacy policy suggested the company could do a lot with your data.

“At a glance it might seem like there’s only a narrow set of circumstances for third parties to get ahold of your info, but in reality, these guidelines give the company tons of freedom,” the outlet’s Rhett Jones wrote last week. “It can share your data internally, with subsidiaries, third-party vendors, and the government upon request.”

(Even after the Reuters correction, Gizmodo wasn’t impressed.)

A Questionable Data Point

To be honest, this is a situation that became much more controversial because of an error in a story. Skeptics may disagree, but it feels like a nothingburger in retrospect.

Nonetheless, the episode offers a good entry point into a topic worth pondering in more depth: how organizations use the data of their members, and what is considered out of line.

A vacuum company selling information about your home feels out of line. So, too, it might seem out of line for political candidates to sell their email lists to outside parties, including the very candidates they competed against just a few weeks earlier—but it happens, and it leads to awkward situations.

When someone hands an email address, a phone number, or a physical address to an association or a nonprofit, it's often given with certain expectations—that a service will be rendered, or that the person might receive marketing or a pitch for donations. If that's how the rules are framed, moving beyond those rules, or breaking those expectations, can feel like a betrayal.

Granted, laws such as the U.S. CAN-SPAM Act of 2003 or Canada's Anti-Spam Legislation set rules of the road for how we use data in certain contexts. But they only go so far. And while things like privacy policies and end-user license agreements can help define the path, they have basically the opposite problem: The level of specificity is so high that even your most dedicated users will ignore most of the salient points.

That’s why, as I pointed out in a 2014 blog post, users are often more willing to say that they want privacy than they are to take steps to keep it. It’s not that they changed their minds; it’s that their resolve has been beaten down to the point where they’ve accepted it, even if they don’t exactly like it.

But, as the iRobot controversy shows, there are still limits to how far the public is willing to go in the name of a product—even something as cool as a robot vacuum.

Acquiring Data the Right Way

One thing I covered recently that really speaks to this point: The American Association for the Advancement of Science is adding a paywall to the daily news site associated with its Science magazine, but as part of its initial experiment, readers are "paying" for access with data—first an email address and, later, more demographic information.

It’s just a test for now, but it puts AAAS in a position where it’s setting an expectation with readers on how their data will be accessed and shared: If they want more stories, they have to offer more data. If they don’t want to offer more data, the stream of stories stops. Fair trade.

Odds are that far fewer people will complain about this test than complained during the recent iRobot hubbub, because neither the acquisition of the data nor how that data will be used in the long run feels out of left field.

There’s a right way and a wrong way to acquire data, and there’s a right way and a wrong way to use the information you acquire.

The public has been burned by bad actors who push the edges of data ethics. Setting expectations the right way might just become a competitive advantage someday.

iRobot, the maker of the Roomba, has faced controversy over a Reuters story that implied it would sell user data. The story was later corrected. (MacDX1/Flickr)

By Ernie Smith

Ernie Smith is a former senior editor for Associations Now.

