More on the state of scholarly publications these days in What I learned from predatory publishers, an opinion piece in a special issue at the linked site.
College education is supposed to involve more than just ‘knowing stuff’. It might be just that in the minds of bean counters who increasingly drive campus policies. After all, they want bigger revenue streams and lower overhead, so the faster they can declare delivered! on an undergrad experience, the sooner they can bring in the next customer. Get ’em up, move ’em out! Nevertheless, classically it was a lot more – and our society needs it to be a lot more.
That’s why interested scholars look cautiously at articles like Many Colleges Fail to Improve Critical-Thinking Skills. And you should too. (Sorry about the pay wall, but you should be able to get that via on-campus access.) Same for Is the U.S. Education System Producing a Society of ‘Smart Fools’? We might not like the answers to that question.
Objective evidence of our collective failures on campus can be found everywhere. We see reports such as The Campus Inquisition at Evergreen State College, and also Those ‘Snowflakes’ Have Chilling Effects Even Beyond the Campus. (Another pay wall.) These articles report how communication skills are declining, perspectives are narrowing, and any sense of respect for diversity, or of selfless dedication to causes greater than oneself, is being lost.
[And there are plenty of examples of how the products of today’s educational systems think that there ought not be consequences in the marketplace, which historically has served as a hard but most excellent mechanism for quality improvement. Take for example: The tech world is rallying around a young developer who made a huge, embarrassing mistake.]
Utterly not a coincidence is the economic link, as mentioned in The US college debt bubble is becoming dangerous. In fact some of us have been trying to raise the alarm that higher education is in a bubble which is bursting around us. (What a shame College Park is driving itself to a place that will not allow our campus to be one of the survivors, much less rise to a leadership role in transforming higher education for the better.)
Time for another roll up of relevant links!
Yes, what follows are all quick reads that I should have posted as I saw them, but let’s admit it: keeping this pace up when there is no expectation it can be used is pretty depressing. I started this site to (in part) seed a small repository with articles my students might use as examples of topics of interest to scholars in this field. Honors seminars, leadership development classes and more are places where we offer not just content but also the practices and temperaments of scholars in our field. How better to deliver these than by sharing notes on what I follow?
What a shame that neither the CS department nor Honors College actually want seminars, leadership development classes or more variety in our courses – not unless they are taught by one of the ‘in crowd.’
Some of us may be cultural pariahs but that doesn’t mean we stop learning, thinking or critiquing, so without further ado, here are some recent examples of how to be a skeptical scholar.
The Most Important Scientist You’ve Never Heard Of is a great narrative about the discipline, objectivity and passion that scholarship demands … and the advocacy value with which it is rewarded.
Bullshit is a commodity much in supply on this campus. Exercise for the students: see if you can apply the tips below to sniff out which campus programs shovel bovine scatology as compared with offering you genuine value. The Baloney Detection Kit: Carl Sagan’s Rules for Bullshit-Busting and Critical Thinking is timeless even if a bit lacking in specifics. Pocket Guide to Bullshit Prevention gives you another compact checklist. But in How to call bullshit on big data you can read about how scholars elsewhere teach ways to combat BS. (How interesting that that campus and ours treat the same topics so differently. One teaches methods that the other teaches must be sniffed out. Well.)
What’s the effect of sham scholarship? You publish papers like The Conceptual Penis as a Social Construct: A Sokal-Style Hoax on Gender Studies. That paper is 100 percent USDA choice bullshit, and it made a splash as such on the internet when word of it recently came out. Then there is Dog of a dilemma: the rise of the predatory journal – more of the same essentially. You will see these periodically. (Save them when you do, and even think about posting them here!) The business model of scholarship is supposed to filter out material like that before it piles up and starts to smell up our fields. Yet … there they are linked. It is probably useful to ponder what equivocal assertions are populating our fields if even these blatantly silly works can appear. It is even more useful to ponder what scholarly practices will help you tell which are which.
Then there are the thorny examples. Daryl Bem Proved ESP Is Real is one. Figure that one out! There is in general a lot of valid discussion of just how well some fields conduct experiments too. Why we can’t trust academic journals to tell the scientific truth takes that up, but there are many other threads around the web too. Take for example Data, Truth and Null Results.
Let’s make sure our work will be cited in others’ blogs because it illustrates great qualities and positively influences the field … not because it is a teaching example of what other schools want students to know how to sniff out!
The Marine Corps culture of innovation gets some great visibility in this light HuffPo article.
Knowing how to distill quality – and do it fast for cheap – is central to my message in SEAM about why software engineering is more important than ever. Most attention in the industry is on cyber, cyber, cyber, but really, no stakeholder would be much happier for his system being down due to bad design as compared with being down for some outside hack. Down is down.
Quality is a holistic thing, and security is a piece of the quality puzzle. My view: knowing how to predictably make systems of good quality – they work and are secure – is important, but knowing how to do that lean will be the lifeblood of any economic rebirth in this country. The field is actually less able to do the predictable part today as compared with some years ago, so UM’s role in this should be to lead the way.
The saying used to be “pick any two” but we need good, fast and cheap, which don’t come as a set from most crap-for-practices software development environments today. (Yes, ridiculous practices are promoted on this campus too, since Main Admin has business incentive to cash checks and move students out the door fast without regard for long term impacts. A pity we don’t do this right.)
A renaissance in quality needs more than promotion of technology, of course; contracting practices need some liberation as well, since the exclusive club of companies which might know how to do things both good and fast have very little incentive to agree to do it for cheap. That exclusive club will only become more exclusive over time if there isn’t strong leadership. The lore of evidence-based process improvement in software systems will continue to fade.
Good on USMC for promoting a culture of innovation, quality, measurement and continuous improvement; too bad for UM that our culture does not reflect the same virtues.
At Quanta today a nice article written for the masses tells about formal methods.
Many of us believe that such methods are what we need in the long run, not just for security but quality in general. We can’t get much solace from a program which defies malicious use if it also fails users with benevolent needs. And it turns out that the issues of security (itself a broad notion) go hand-in-hand with the business of getting your functionality right.
Please understand a particular challenge called out in that article. It talks about hardening just some critical points (for security), and the reality is: designing with mixed levels of formalism is incredibly hard. You would not presume a house is secure just because you did a really good job on the properties of the deadbolt lock if your door was not correctly fastened on the hinge side; you sort of need to know things about both, not just a lot about one side.
Other analogies apply. If you take a glass of pure spring water and drop in a bit of sewer water, you end up with sewer water; if you take a glass of sewer water and mix in pure spring water, then you still have sewer water. The suggestion is you’re no better off than the poorest piece of your system, and moreover systems are often not built from discrete blocks so much as code blends. Just pouring in some quality ingredients doesn’t make the rest palatable. You usually only know something strong about a program’s value once you see it in context.
Students in our program should reasonably ask what our department offers in the way of formal methods, whether just for security or for quality overall. Today the answer is “not much”, but years ago all students starting in our introductory programming sequence learned our crafts using formal methods (functional notations and some specification frameworks pioneered by a fellow named Harlan Mills). The canonical ‘old guy’ observation is that we consistently graduated some of the best programmers of the era, which we attribute to the fact that they all learned fine code design in a framework that rewarded structures about which someone could reason. You had to take the time to make it clean and simple so you could meet the spec and move on. The debug-by-friction habits we see students picking up in our Java sequence (and often never shaking) would not get you far. As it turns out, and judging by what we see in learning outcomes assessments, those hack-and-slash techniques aren’t getting students very far today either.
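To give a flavor of that style for readers who never saw it, here is a minimal sketch (in modern Python, not the Mills-era functional notation, and with names of my own choosing): state a function’s contract up front, then keep the code simple enough that a reader can check it against that contract by inspection, with assertions as a safety net.

```python
# A toy illustration of specification-first programming: write the
# contract down, then write code a reader can verify against it.
# (This is an illustrative sketch, not the actual notation taught then.)

def isqrt(n: int) -> int:
    """Integer square root.

    Spec: requires n >= 0; ensures r*r <= n < (r+1)*(r+1).
    """
    assert n >= 0, "precondition violated: n must be non-negative"
    r = 0
    # Loop invariant: r*r <= n. On exit, (r+1)*(r+1) > n as well,
    # which together give exactly the postcondition.
    while (r + 1) * (r + 1) <= n:
        r += 1
    assert r * r <= n < (r + 1) * (r + 1), "postcondition violated"
    return r

print(isqrt(17))  # 4
```

The point is not the algorithm (this one is deliberately naive) but the habit: the invariant plus the loop’s exit condition imply the postcondition, so correctness can be argued on paper before the program is ever run, which is the opposite of debug-by-friction.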
What happened? The long-standing internecine warfare between field committees in our CS department left formal methods a thing of the past; software engineering lost. Today’s leadership has pretty much put a nail in the coffin of the robust software quality perspective which helped put our department on the map. This is true in our curriculum, and to a great extent even in our research, wherein resources and policies support the pet programs of select faculty doing other things.
And what a shame. The linked article makes clear the formal methods of that era grew and are blooming today, in ways we might never have dreamed but had certainly hoped. It is a space in which Maryland might have led. Our courses would certainly not be using tools from a couple decades ago – after all, our insights would improve over time as did everyone else’s – but we’d at least be in the game and talking authoritatively about the big picture of quality. Our campus leaders instead have just what they engineered: a department that, lacking its own vision, whores after dollars in a cybersecurity market built by others instead of setting the industry’s standard for quality (of which security is one piece).
A few articles on university competitiveness turn up with this morning’s coffee. A fine article from the Foundation for Economic Education (whose links you see sprinkled around our web site here for good reason) addresses what we would call the mission drift on campuses; as stakeholders are freed to define more of their own roles, understandably many define them to be something other than the mission which created a campus in the first place.
An Atlantic article comments on administrative bloat. The author could very well have said “see Maryland.”
So it is not necessarily a coincidence that we place where we do in the latest US News and World Report rankings, which just came out. College Park lives among the also-rans (tied for number 60 with Fordham, Purdue, Syracuse, Connecticut and WPI), a drop from last year. (Formerly at 19 among public schools, now at 20.) Compare for yourself at USNWR.
The Purple Line’s funding issues have recently slowed its early construction efforts, but never fear, its proponents – including, we presume, UM President Wallace Loh, who single-handedly overcame local opposition and championed this campus-splitting project’s approval – remain optimistic about its prospects.
Which is more optimism than we can muster for traffic conditions during said implementation, based on reading the Washington Post’s article about a similar light rail project in Charlotte. Read for yourself the devastating effect that project has had on the region there.
Students in my classes know how often I advise them to “call your shots” – that is, do an honest and personal self-assessment of performance (whether on a project, in a class or on your job). Only by genuinely understanding the difference between your aspirational and actual outcomes (and why they came to be different!) can you become effective at bringing the two into alignment. Lies are things we tell at a bar, not what we tell ourselves. (Another colloquialism from Purtilo: “Never believe your own b******!”)
Doing such an assessment is often difficult, not only because it brings us face to face with outcomes that are sometimes short of what we wanted. It takes practice. So when we see good examples of how this is done (and especially with analysis as to why, so one can improve) we really sit up and take notice.
That’s the case this morning with a spectacular assessment from Nate Silver, at his site fivethirtyeight.com. Silver’s piece is titled “How I Acted Like A Pundit And Screwed Up On Donald Trump” and he goes into excellent detail about the statistical methods that worked (or sometimes didn’t work) in his predictions on this year’s races to date.
Forthright assessments are a hallmark of serious scholars, and I commend this to you as a great example. This should be all the more interesting to some on campus since of course Silver had been one of the First Year Book authors on our campus.
… and not that other stuff. That’s effectively what Facebook provided, according to “news curators” who had worked there and were interviewed for a Gizmodo article. The effect was to put a thumb on the scales of public opinion, biasing it toward promotion of liberal views and suppressing material that might have reflected conservative opinion, or so was the article’s point.
And that point would be quite plausible when you have an unchecked system that relies upon promotion of articles by people who are drawn from a pool that itself is dominated by certain views. Selection bias nuances the choice of curators, and the curators thus bias the messaging by what they choose to promote. How do they recognize a likely article? They’ll know it when they see it.
Company Demolishes Wrong Housing Duplex Following Google Maps Error. The banner there sums it up.
Do you think perhaps people are getting a little too reliant on technology that is offered to consumers only as convenience?