Analytics and Forecasting

Forward to the Past

10 Feb 2016  

I used our recent office relocation to review some files I had not visited in a while. Though an arduous undertaking, it provided some surprising rewards. Among other things, I came across one of my first major projects, done with the great firm of Peat, Marwick, Mitchell — which soon after became KPMG.

Our client was the Department of Taxation and Finance for New York State. I worked under a business economist, Don Welsch — a brilliant and fun guy, a real visionary, and a good mentor to me.

My job was to develop a model for the economy of New York State, segmented into about 150 sectors. The State wanted to have a computer model ready to go so they could change sales tax rates (read:  raise taxes) in selective categories if they needed to rapidly plug a revenue gap.

I had in effect become the firm’s reigning sales-tax guru during a previous engagement, and before that had studied data analysis and linear modeling at Yale. This was a unique opportunity to fuse and leverage both streams of knowledge.

Things were different then

It hardly needs saying that many aspects of this assignment were much different then than they would be today. It was the Digital Dark Ages! (Though of course we didn’t know this at the time. We were just trying to get our work done in the best way possible.) The Internet was still a dozen years in the future. Personal computers were just barely on the horizon, and nowhere to be seen even at well-funded firms like KPMG.

[Image: TI Silent 700 portable terminal, 1976]

Don had a relationship with Chase Econometrics, a timesharing service that we used for time-series data by dialing in on a TI Silent 700 terminal through an acoustic coupler, into which we plugged a landline phone, as you see on the left. Results came back to us dot-matrix-printed on proprietary (and correspondingly expensive) thermal paper at a blazing 300 baud (about 30 characters per second).

That was a quick sprint through the infotech graveyard — but, for its time, our process was state-of-the-art. The Silent 700 was one of the first portable terminals to achieve major commercial penetration, so we could work from an office (ours or those of our client), rather than have to go to a data center. Many of our consulting colleagues regarded us as futuristic space cadets.

In a sense, we were doing big data long before anyone had found a need for the term. Yet, at the same time, it was all reassuringly artisanal. We had to call ahead by phone to make sure the tape operator at the Chase offices in Massachusetts had the correct tape loaded on the mainframe at their end.

We also used sources that were not computer-based at all. I remember often visiting the US Commerce Department in Washington so we could make full use of its Personal Consumption Expenditures (PCE) data, which at that time were available only in a paper publication.

The model I developed contained measurements of revenues and growth rates for about 150 economic micro-sectors, including both products and services important to the New York economy. When I finished the research, I had a guy program the interactive portion of the model using timesharing on a mainframe — because PCs were not yet available to us. (The Apple II was around, but few of us big-firm New York guys took PCs seriously as tools for client work until the IBM PC appeared in 1981.)
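The model itself is long gone, but its core arithmetic is easy to convey. Below is a minimal sketch in Python (a language that did not exist then) of how a disaggregated sales-tax model works: project each sector’s taxable base forward at its growth rate, then solve for the rate increase a chosen category would need to plug a given revenue gap. Every sector name and figure here is hypothetical, not taken from the original model.

```python
# Toy disaggregated sales-tax model. All sector names, bases, growth
# rates, and tax rates here are hypothetical; the real model covered
# some 150 micro-sectors with empirically derived figures.

# sector -> (taxable sales base in $M, annual growth rate, tax rate)
sectors = {
    "restaurant_meals": (9500.0, 0.04, 0.04),
    "clothing":         (7200.0, 0.03, 0.04),
    "motor_fuel":       (6800.0, 0.02, 0.04),
    "utilities":        (5400.0, 0.05, 0.04),
}

def projected_revenue(sectors, years=1):
    """Total sales-tax revenue after growing each sector's base."""
    return sum(base * (1 + growth) ** years * rate
               for base, growth, rate in sectors.values())

def rate_increase_to_plug_gap(sectors, category, gap, years=1):
    """Extra tax (as a fraction) one category needs to yield `gap` ($M)."""
    base, growth, _rate = sectors[category]
    projected_base = base * (1 + growth) ** years
    return gap / projected_base

baseline = projected_revenue(sectors)
bump = rate_increase_to_plug_gap(sectors, "restaurant_meals", gap=50.0)
print(f"Baseline revenue: ${baseline:,.1f}M")
print(f"Raise the restaurant_meals rate by {bump:.2%} to yield $50M more")
```

In spirit, the real model simply did this at 150-sector resolution, with bases and growth rates estimated from the data sources described above.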

Our research apparently got some traction in the world of municipal finance, and was published in a paper (“Estimating Disaggregated Sales Tax Impacts: An Empirical Investigation”) that Don and I presented at the Southern Regional Science Association convention in Knoxville, Tennessee, in 1982 — my first published paper.

Yet things were much the same

Our tools were primitive by today’s standards. Yet our client’s problems and opportunities, and the use case for our work product, were much the same as they are to this day. Decision makers (i.e., legislators) wanted to take actions (i.e., change tax rates) based on empirical evidence — not just wing it and hope for the best. They needed us to (1) develop and curate the information, and (2) make it readily accessible and usable when needed.

We succeeded in putting the NYS economy into an electronic “box” that could then be pushed and poked by decision makers to forecast and create real-world results. The knowledge we provided them became a springboard to their value-enhancing actions.

In that sense, this project from over three decades ago was essentially the same as what I do now. It would of course be much easier (and faster and less expensive) today, with PCs and the Internet.

Looking back on it, I am proud of this engagement. It was empirical, it was value-relevant, it was usable, and it was dynamic — tests for quality that I still apply to my work products.

I am extraordinarily fortunate to have been able to pursue this work that I find continually interesting, challenging, and (dare I say) useful both to my clients and to society.

