Monday, December 29, 2008

The Principal Investigator's Burden

After a day of pushing papers in an empty building at an abandoned university, where the students have fled for the holiday break and his colleagues without external support are away with their families, Eli came across this editorial by Alan Leshner in Science

Reduce Administrative Burden

and the bunny said

YES!!!!!! *$&&#&# YES

Leshner goes on to say
A 2007 survey by the U.S. Federal Demonstration Partnership (A Profile of Federal-Grant Administrative Burden Among Federal Demonstration Partnership Faculty) found that 84% of faculty in the United States believe that the administrative burden associated with federally funded grants has increased significantly in recent years. Most notably, the study indicates that of the total time that faculty devote to research, 42% is spent on pre- and post-award administrative activities.
Leshner considers goals
An ideal goal would be for every science-related rule or regulation to be rationalized and streamlined. As a group, they should be integrated as much as possible so as to reduce unnecessary duplication. New versions should address the lack of uniformity across agencies.
and solutions
Whoever takes the lead in reducing administrative burden might consider a somewhat unorthodox approach to reviewing and revising existing regulations. Rather than starting with the evaluation of each existing policy one at a time, it might ultimately be better to start anew from an integrated list of all the issues that must be addressed, and then take an entirely fresh look at what rules and regulations should be applied. Although this might trigger fears of "reinventing the wheel," it also might prove the point of another old adage: "Never underestimate the value of 'square one.'"
Comments?

6 comments:

John Mashey said...

Yes, this is similar to:
1) Software
2) The IRS tax code

both of which get more complex over time and rarely get simpler without being redone from scratch.

Dad worked for the IRS for a while. Although that meant excellent advice was on hand ("Dad, will they audit me for this?" "Yes."), I sometimes expressed my irritation at the complexity. He said, basically, that the IRS worked very hard to simplify things but couldn't keep up with Congress.

I've often thought that one needs absolute limits like:

a) The software can be no bigger than X
b) All the relevant laws are limited to some total number of words

and in either case, you have to get rid of something if you want to add something else (a "one in, one out" rule; see the sketch at the end of this comment).

OR

c) You assign your best software people to removing code and redundancy [there always is some], assuring them that their salary won't get hit for producing negative lines of code.

d) In a bicameral legislature, one house passes laws and the other repeals them.

Alas, although I've been able to do a) and c) when I was a software engineering manager, b) and d) seem unlikely.

In the case at hand, it won't happen unless it's made an explicit job for somebody and heavily backed.
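
To make the "one in, one out" idea concrete, here is a minimal Python sketch of a rule book with a hard word budget, where nothing new goes in unless something else is explicitly retired. All the names and numbers (RuleBook, word_budget, the sample rules) are invented for illustration, not any agency's actual system:

    # A hard "complexity budget": adding a rule past the cap requires
    # naming an old rule to retire first. Names here are hypothetical.
    class RuleBook:
        def __init__(self, word_budget):
            self.word_budget = word_budget
            self.rules = {}  # rule name -> rule text

        def words_used(self):
            return sum(len(text.split()) for text in self.rules.values())

        def add(self, name, text, retire=()):
            """Add a rule; the caller must retire old rules to stay in budget."""
            for old in retire:
                self.rules.pop(old, None)
            needed = len(text.split())
            if self.words_used() + needed > self.word_budget:
                raise ValueError(
                    f"over budget: retire something before adding {name!r}")
            self.rules[name] = text

    book = RuleBook(word_budget=20)
    book.add("timesheets", "File effort reports quarterly on form 1234")
    book.add("travel", "Pre-approve all travel over five hundred dollars")
    # This raises ValueError until an old rule is explicitly retired:
    # book.add("export", "Screen all visitors against the export control list")
    book.add("export", "Screen all visitors against the export control list",
             retire=["timesheets"])

The point of the design is that the pain of growth lands on whoever wants to add, which is exactly where Mashey's options a) and c) put it.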

EliRabett said...

Given how spare software used to be when there was a 32K memory barrier and 8-bit processors, perhaps along with the carbon tax we should introduce a memory tax, with a luxury surcharge above 4 GB?

(Ms Rabett always did say that every man needs more memory)
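
Taking the joke half seriously, such a tax is just bracket arithmetic: a flat levy per GB plus a surcharge on every GB above the threshold. A toy Python sketch, with rates invented purely for illustration:

    # A toy memory tax: flat rate per GB, plus a luxury surcharge on
    # every GB above the 4 GB threshold. All rates are hypothetical.
    BASE_RATE = 1.00      # dollars per GB (invented)
    SURCHARGE = 5.00      # extra dollars per GB above threshold (invented)
    THRESHOLD_GB = 4

    def memory_tax(gigabytes):
        base = BASE_RATE * gigabytes
        luxury = SURCHARGE * max(0, gigabytes - THRESHOLD_GB)
        return base + luxury

    for gb in (2, 4, 8, 32):
        print(f"{gb:>3} GB -> ${memory_tax(gb):.2f}")
    # 2 GB -> $2.00, 4 GB -> $4.00, 8 GB -> $28.00, 32 GB -> $172.00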

John Mashey said...

When we shifted from PDP-11s to VAXen, and virtual memory limits rose from 64KB I + 64KB D to gigabytes, many things got easier, and correctly so. Some of us worried that lean programming might be endangered, and that was correct also.

When I take younger folks around the Computer History Museum, they laugh at the absurdity of memories measured in small numbers of kilobytes, and, seeing punchcards, they cannot believe anyone could ever have written programs that way. Ahh, for the "good old days."

On the other hand, having helped design the first 64-bit micro, and systems that actually ran individual programs approaching 1 TB of real memory usage, and helped get the necessary changes into C, I would find it inconsistent to push for a tax on big memories!

See: Long Road to 64 bits.

Anonymous said...

Einstein put it best:

It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience.

Same goes for nearly everything else.

I have worked as a programmer for over 30 years, and in my experience that certainly applies to software.

I also paint as a hobby and it is also true there.

After things have reached critical mass (mess?), it IS usually best (and easiest) to simply throw everything out the window (or Windows, in the case of Microsoft) and start from scratch.

That may be one of the hardest things to learn. It sure was for me. It took me YEARS.

Anonymous said...

No matter how careful you are, I think every computer program inevitably reaches a point at which the bugs outnumber the useful features.

That's the time to call it quits, IMHO (or even before it reaches that point).

Otherwise, it just becomes bugs all the way down (bugs to infinity...and beyond!)

I've debugged programs in my day (ones that I did not write, of course) where I was not sure what was a bug and what was not.

Pretty sad, I know, but it happens all too frequently, unfortunately.

David B. Benson said...

From Thoreau's "Walden" (very much in the spirit of Strunk & White's "The Elements of Style"):

"Simplify, simplify!"

[The word verification agrees, stating simply "bless".]