Mar 07 2018

Okay, it’s been a while since I had a moment to update here, but a WSJ article begs for a bit more visibility. Yes, Facebook really is spying on you.

This doesn’t share anything we didn’t already know, but for those of you with friends who haven’t yet keyed in on how deeply they are being tracked (and not for their benefit, to be sure), this might be a useful link to share.

Posted at 2:56 pm on March 7, 2018
Sep 21 2016

At Quanta today, a nice article written for the masses tells the story of formal methods.

Many of us believe that such methods are what we need in the long run, not just for security but for quality in general. We can’t take much solace in a program which defies malicious use if it also fails users with benevolent needs. And it turns out that the issues of security (itself a broad term) go hand-in-hand with the business of getting your functionality right.

Please understand a particular challenge called out in that article. It talks about hardening just some critical points (for security), and the reality is that designing with mixed levels of formalism is incredibly hard. You would not presume a house is secure because you did a really good job on the properties of the deadbolt lock if the door was not correctly fastened on the hinge side; you need to know things about both, not just a lot about one side.

Other analogies apply. If you take a glass of pure spring water and drop in a bit of sewer water, you end up with sewer water; if you take a glass of sewer water and mix in pure spring water, then you still have sewer water. The suggestion is you’re no better off than the poorest piece of your system, and moreover systems are often not built from discrete blocks so much as code blends. Just pouring in some quality ingredients doesn’t make the rest palatable. You usually only know something strong about a program’s value once you see it in context.

Students in our program should reasonably ask what our department offers in the way of formal methods, whether just for security or for quality overall. Today the answer is “not much”, but years ago all students starting in our introductory programming sequence learned the craft using formal methods (functional notations and some specification frameworks pioneered by a fellow named Harlan Mills). The canonical ‘old guy’ observation is that we consistently graduated some of the best programmers of the era, which we attribute to the fact that they all learned fine code design in a framework that rewarded structures about which someone could reason. You had to take the time to make it clean and simple so you could meet the spec and move on. The debug-by-friction habits we see students picking up in our Java sequence (and often never shaking) would not have gotten you far. As it turns out, and judging by what we see in learning outcomes assessments, those hack-and-slash techniques aren’t getting students very far today either.
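For readers who never saw that era, the flavor (though not Mills’s own notation) can be suggested in any modern language by writing the specification first and treating it as the contract the code must satisfy. Here is a minimal sketch in Python (the function and its annotations are our own illustration, not anything from the old curriculum), using runtime assertions to stand in for what a real formal framework would prove statically:

    # Specification-first sketch: the contract comes before the body.
    # The assertions merely *check* the contract on one run; a genuine
    # formal-methods tool would *prove* it holds for all inputs.

    def binary_search(xs: list, target: int) -> int:
        """Return an index i with xs[i] == target, or -1 if target is absent.

        Precondition:  xs is sorted in nondecreasing order.
        Postcondition: result == -1 and target not in xs, or
                       0 <= result < len(xs) and xs[result] == target.
        """
        assert all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1)), "precondition"

        lo, hi = 0, len(xs) - 1
        while lo <= hi:
            # Loop invariant: if target is in xs, it lies within xs[lo..hi].
            mid = (lo + hi) // 2
            if xs[mid] == target:
                result = mid
                break
            elif xs[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        else:
            result = -1

        assert (result == -1 and target not in xs) or xs[result] == target, "postcondition"
        return result

The point of that pedagogy was exactly the order of operations: the invariant and the postcondition come first, and code which cannot be annotated this way is code about which nobody can reason.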

What happened? The long-standing internecine warfare between field committees in our CS department left formal methods a thing of the past; software engineering lost. Today’s leadership has pretty much put a nail in the coffin of the robust software quality perspective which helped put our department on the map. This is true in our curriculum, and to a great extent even in our research, wherein resources and policies support the pet programs of select faculty doing other things.

And what a shame. The linked article makes clear that the formal methods of that era grew and are blooming today, in ways we might never have dreamed of but had certainly hoped for. It is a space in which Maryland might have led. Our courses would certainly not be using tools from a couple decades ago – after all, our insights would improve over time as did everyone else’s – but we’d at least be in the game and talking authoritatively about the big picture of quality. Our campus leaders instead have just what they engineered: a department that, lacking its own vision, whores after dollars in a cybersecurity market built by others instead of setting the industry’s standard for quality (of which security is one piece).

Posted at 10:53 am on September 21, 2016
Feb 27 2015

We commend to you a nice little back-and-forth on the merits of cost estimation in software development processes. No overwhelming argument is made in opposition to estimation practices, but at the same time the exchange recognizes what a lot of us know: a lot of people do estimation without understanding estimation. So of course the practice can have what is at best described as mixed results. This article is worth a read by our students of software engineering.
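As a small taste of what “understanding estimation” involves, consider the classic three-point (PERT) technique, which attaches an uncertainty figure that casual estimators routinely drop. A minimal sketch in Python using the standard textbook formulas (our own illustration, not anything drawn from the linked exchange):

    # Three-point (PERT) estimate from optimistic, most likely, and
    # pessimistic guesses. The spread matters as much as the point value.

    def pert_estimate(optimistic: float, likely: float, pessimistic: float):
        """Return (expected, standard_deviation) under the usual beta assumption."""
        expected = (optimistic + 4 * likely + pessimistic) / 6
        std_dev = (pessimistic - optimistic) / 6
        return expected, std_dev

    # Example: a task guessed at 2 days best case, 4 typical, 12 worst case.
    e, sd = pert_estimate(2, 4, 12)
    print(f"expected ~{e:.1f} days, +/- {sd:.1f}")  # expected ~5.0 days, +/- 1.7

An estimator who reports the 5.0 without the 1.7 has already thrown away the part of the answer that would have told a manager how much to trust it.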

Posted at 8:35 am on February 27, 2015
Nov 23 2014

A truly marvelous article titled Stop Trying to Save the World describes how “big ideas are destroying international development.” With many examples to illustrate the point, the author calls out small ‘game-changing’ projects which garner accolades and fanfare when piloted in the small, but which subsequently fail when scaled up – and without much fanfare.

People intent on genuine advocacy of issues will find much in this piece, which is credibly researched and presented. Are you concerned with the effects of big government, which similarly looks for big game-changing ideas but doesn’t follow through to find the core pressure points to hit or to know when to stop? (Are you channeling Pat Moynihan?) You’ll see the same points here. Are you concerned with the federal government’s penchant for investing in small research pilots, seeing good preliminary results, and then presuming it will all work on a grander scale? (Do you believe Fred Brooks when he asserts there is No Silver Bullet to jump-start jeopardized projects, or to let you bypass careful attention to detail down in the weeds of your project?) You’ll find the author of this piece a fellow traveler.

Even if on a different scale, this article introduces a notion which should be pressed upon every administrator on this campus: the danger of broadly applying some major policy or curriculum change without first understanding what the essential preconditions of a pilot’s earlier success were. Having done something special once, in controlled settings with a spectacular professor at the helm, we presume that what carries the success forward is the pilot’s title, its PowerPoint slides or videos, and the course eval figures (aka popularity) of the professor. (In worse cases, we launch changes without bothering with the pilot, then just declare success without any dispassionate assessment of whether this takes us closer to or further from some core scholarly objectives.)

All of which is to say, as the author of the linked article was one step short of expressing, that we’ve lost the will or capacity to do substantive measurement.

Posted at 9:43 am on November 23, 2014
Nov 04 2014

Price discrimination on the internet is the practice of showing consumers different options and prices depending on profiling information compiled by the various vendors involved. Today’s WP has a nice exercise in trying to untangle some of what goes on in the business logic.

As you read that article (and we recommend that you do), keep in mind that profiling isn’t just used to determine what hotel rooms to offer or which widgets to push first at a shopping site; it can be used to determine what performance you might see from your service provider (possibly on a session-to-session basis), what search results you will see from your visit to Google or Bing, and someday even what connectivity you might have to one or another part of the deep internet.
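To make the mechanism concrete, the business logic behind such steering can be as simple as a price adjusted by a handful of profile attributes. A purely hypothetical sketch in Python (every attribute, rule, and number below is invented for illustration and is not drawn from the WP article):

    # Hypothetical profile-based price steering. All signals and
    # multipliers here are invented for illustration only.

    BASE_PRICE = 100.00

    def quoted_price(profile: dict) -> float:
        """Adjust a base price using signals a tracker might have compiled."""
        price = BASE_PRICE
        if profile.get("device") == "mac":            # proxy for willingness to pay
            price *= 1.15
        if profile.get("zip_income_band") == "high":  # inferred from location
            price *= 1.10
        if profile.get("repeat_visitor"):             # has shopped this item before
            price *= 1.05
        return round(price, 2)

    print(quoted_price({"device": "mac", "repeat_visitor": True}))  # 120.75
    print(quoted_price({"device": "windows"}))                      # 100.0

Nothing in that logic is visible to the shopper; two people looking at the same page can be quoted different numbers, and neither has any way to know.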

Posted at 8:40 am on November 4, 2014