Feb 02 2016
 

Students and alumni at UC Berkeley have filed a lawsuit against Google over its practice of data mining and profiling their email traffic through Google’s “Apps for Education” services, which it promotes widely – including on this campus. The suit claims this is a violation of the Electronic Communications Privacy Act.

Google appears to confirm the practice, but asserts that while profiles are created for everyone who uses these tools, it does not target individuals for advertising based directly on the user’s information. However, the company has so far been silent on how it uses these data for its other purposes, and presumably at some point it will need to argue that those uses, while profitable and exploitative, are technically not a violation.

There is no such thing as a free lunch. For users who obtain services from Google at no direct charge, it is not clear what they think the business value to Google is, if not to train fairly elaborate models that recognize someone with exactly the individual’s features, and then to sell the use of those models to companies or government officials who want people identified. Those uses are surely good for Google, corporations and officials; for consumers, not so much.

Google’s practices have been the elephant in the room that few of those involved have an interest in acknowledging. School officials in particular have strong motivation to pay for their digital infrastructure out of their students’ liberty and pockets, the interests of those students be damned. (At UM, the message is also employee interests be damned, as we convert faculty and staff services to Google over the course of this year.)

What brings the present case forward is the students’ assertion that an earlier Google representation (that it would stop direct advertising based on student data) was an admission that it had been violating the Act, in contrast to promises made at the time. Those promises are not unlike the ones made to students on this campus when we directed all traffic through Google servers.

Posted at 8:39 am on February 2, 2016
Jan 10 2016
 

The Washington Post reports on the cutting-edge software police are starting to use to identify people who are possible threats. One example of its use involved flagging someone as a threat based on a 911 call, so that officers could call in a heavier response. It is all based on searching police records and social media.

Anyone who thinks the social media part is searched in real time, once a name or address turns up in a query, is sadly mistaken. An immense amount of static information is compiled and continuously added to, so that it is available at a moment’s notice – as in a 911 call.

One of the several dangers, of course, is that they get it wrong and your innocent actions are misinterpreted, with potentially deadly consequences. Oops. But can you control this? No. As with so many “homeland security” systems these days, police cloak the actual computation in commercial operations, where the algorithms, data sources and records are not subject to public information requests or challenge. It will only be a matter of time until the equivalent of a ‘Google Bomb’ is dropped on someone through social media, after which the next interaction with police could involve trying to persuade a tactical team you are not a threat … while zip-tied face down on your living room carpet.

Posted at 9:52 pm on January 10, 2016
Oct 28 2015
 

While the practice is not widely acknowledged, recent reports shed some light on the immense scope of telephone companies’ use of information about you. Especially with mobile devices, usage data paints a sharp and clear picture of you, and this in turn is readily monetized.

How very generous of you to share your shopping, location and personal network information with these data wholesalers for free, since they use these data to take even more value from you later.

Posted at 7:57 am on October 28, 2015
Sep 25 2015
 

Spoiler alert: the answer is “poorly”.

Government regulations continue to expand and the options available to consumers correspondingly diminish, as called out in the linked article. Bureaucrats make choices about what is ‘best’ for consumers, but these often fly in the face of the choices rational consumers would make in their own interests; officials’ track record is one of promoting decisions that are best for them … not for consumers.

Posted at 9:36 am on September 25, 2015
Jun 29 2015
 

“When a Company Is Put Up for Sale, in Many Cases, Your Personal Data Is, Too,” says the headline at the linked article (and never mind their singular/plural mixup…).

Data agreements are usually entered into implicitly and in the best of times, but contracts really should govern expectations for the worst of times. A company’s dissolution, and thus the liquidation of its assets, means your data might go on the auction block too. And those who acquire your data might not care for it as much as the earlier owner did.

Posted at 5:50 pm on June 29, 2015
Jun 05 2015
 

We often hear about the importance of computers in education, but that message falls flat for many of us who have tried to divine the recipe for whatever secret sauce makes computer-involved instruction work. It often doesn’t. Thanks now to this author, who gives voice to the matter and calls out in detail how technology in challenged schools can only amplify their problems. The loudest calls for increasing tech in education unsurprisingly come from the very companies that stand to benefit most from the policies; the linked article calls out why this is not necessarily the best way forward.

Posted at 10:40 am on June 5, 2015