Economics Network CHEER Virtual Edition

Volume 9, Issue 1, 1995

Point and Shoot!

A Report on the computing sessions at the AEA programme at the ASSA meetings in Washington D.C., January 1995

Guy Judge
University of Portsmouth

It was early afternoon on January 9th. I was standing on the steps at the Terrace entrance to the Washington Hilton and Towers Hotel. The Head Porter walked towards me and smiled. A black man in his fifties, his spotlessly clean uniform and polite demeanour conveyed a feeling of pride in what he was doing for a living. "Is this the right place to catch the Washington Flyer back to Dulles Airport?" I asked him. "Sure is, Sir", he said. "It's also the exact same spot where Ronald Reagan got shot by John Hinckley".

What a cheerful thought! But in the homicide capital of the US the only pointing and shooting I witnessed was in the computing sessions which were part of the American Economic Association's programme at the Allied Social Science Associations meetings. Organised again by Bill Yohe of Duke University and, for the last time, Mike Lovell of Wesleyan University, these sessions were as popular and interesting as ever.

Bill Goffe, Bob Parks and George Greenwade combined to put on a "What's on the Internet for Economists?" demonstration. Bill Goffe, from the University of Southern Mississippi, is well known for his highly informative "Resources for Economists on the Internet" guide - available either in electronic or printed form from various sources including the CTI Centre for Economics. Bob Parks looks after the electronic Economics Working Paper Archive at Washington University in St. Louis - an extremely valuable resource which is open to all economists across the world with Internet access of any kind. George Greenwade of Sam Houston State University runs one of the largest gopher services for economists on the Internet. What they put on was a (partly on-line) demonstration covering all aspects of the tools and resources on the Internet of interest to economists. The session that I attended attracted upwards of eighty people, with some spilling out into the corridor. The organisers had anticipated the tremendous interest that such a demonstration was bound to create and sensibly arranged for two repeat sessions later in the conference programme. They too drew big crowds.

In a fast-moving session Bill Goffe gave a brief history of the Internet, explained some of the key concepts and then illustrated what he had been talking about with a mixture of on-line and previously recorded interactive sessions, which were beamed up onto the screen by Bob Parks, who operated the computer link-up. I attempted to note down as much as I could for this report, but at times it was hard to watch, listen and write at the same time.

I am grateful to Bill and Bob for granting me permission to publish the printed handouts which they produced to accompany their talk and which follow this report.

Bill Goffe began with some background on the Internet itself. He emphasised its decentralised nature, comprising as it does a network of networks all following the TCP/IP protocols. He showed a schematic diagram of the Internet displaying the ANSnet backbone (over which the NSFNET and various commercial services operate) together with links to international, regional and local networks. Information moves around the system in the form of packets which are routed from one computer to another towards their destination address on the Internet. But all of this happens behind the scenes and the detail of it need not concern ordinary users, for whom the system is now highly user friendly, particularly since the development of the World Wide Web (WWW or W3) and the associated "browser" software programs such as Mosaic and Netscape. Describing Mosaic as the "Swiss army knife of the Internet", Goffe enthused over the flexibility and versatility of such graphical Web browsers.

In all likelihood the availability of such easy-to-use tools accounts for much of the recent phenomenal growth of the Internet, in terms of host computers connected, messages carried and services available. After increasing at a rate of about 5% per month for the last few years (doubling every two years) the number of host computers connected to the Internet is now doubling every year. The proportion of Internet traffic accounted for by the World Wide Web has already reached around 8.5% and, because Internet browsers can also simplify ftp and gopher connections, it will soon take over as the most important Internet protocol.

Goffe explained the format of a typical e-mail address and host computer name. His own address, for example, follows the standard Internet naming scheme, with named computers and domains in a sequence separated by dots. On the far right of the address the edu tells us that it belongs to an educational institution - other categories include commercial (com) and government (gov) organisations. The address becomes more specific as you move to the left, identifying the organisation (the University of Southern Mississippi - usm) and, on the far left, the computer or server to which messages are directed. For countries outside the US an extra element is added on the far right to identify the country (e.g. uk for the United Kingdom, au for Australia, etc.). Other countries may also have slightly different labelling systems - for example, in the UK we use ac to denote academic institutions and co for companies.
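
To make the left-to-right structure concrete, here is a minimal sketch (in Python, not something shown in the session) that splits a host name into its domain labels and reports the category of the rightmost one. The host names used in it are invented purely for illustration.

    # Hypothetical illustration of how an Internet host name decomposes into
    # domain labels, read from the most general (right) to the most specific (left).
    TOP_LEVEL = {
        "edu": "US educational institution",
        "com": "commercial organisation",
        "gov": "government organisation",
        "uk": "United Kingdom",
        "au": "Australia",
    }

    def describe_host(hostname):
        """Print the rightmost (most general) label first, then work leftwards."""
        labels = hostname.split(".")
        top = labels[-1]
        print(f"{hostname}: ends in '{top}' ({TOP_LEVEL.get(top, 'unknown category')})")
        for label in reversed(labels[:-1]):
            print(f"  more specific label: {label}")

    describe_host("somehost.usm.edu")        # invented machine name at usm.edu
    describe_host("econ.example.ac.uk")      # invented UK academic host (ac = academic)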

Goffe then spoke briefly about the World Wide Web and the client-server architecture on which it is based. The client is the software on your computer which enables you to connect up with the remote host computer (server) you want to access by means of an addressing system based on the URL (Uniform Resource Locator) format. A URL consists of three parts: the protocol which is used (telnet, ftp, gopher or http), the host computer address and finally the location of the resource on the host machine (directory/subdirectory). The general format is protocol://host/location.
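
The same three-part pattern can be seen by pulling a URL apart with a few lines of Python (purely illustrative; the session itself used browsers, not scripts). The URLs below are ones that appear later in this report.

    # Split URLs into the protocol/host/location parts described above.
    from urllib.parse import urlparse

    for url in ("http://www.census.gov/",
                "http://econwpa.wustl.edu/EconFAQ.html"):
        parts = urlparse(url)
        print(f"protocol = {parts.scheme}, host = {parts.netloc}, location = {parts.path}")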

FTP is the protocol for moving files between a remote computer and your own machine. With anonymous ftp, you log on by giving "anonymous" as the user name and your Internet e-mail address as the password. Telnet allows you to log on to a remote machine to make use of the services on that computer (provided you have permission to do so). Gopher allows you to access network resources by means of an on-screen pointer within a common menu structure. The http prefix stands for HyperText Transfer Protocol, which is the protocol for accessing World Wide Web documents. Such documents have built-in hypertext links to enable you to jump to other parts of the document or to other documents located on computers somewhere else on the World Wide Web. They can incorporate graphics and even sound and video clips (although you may need additional hardware and "helper" software as well as your browser to enable you to experience all the features of such resources). There is now so much out there on the Internet in this format that we may end up bypassing CD-ROM based material altogether.

Before looking at each of these protocols in more detail Goffe digressed and turned to e-mail and e-mail lists. Comparing e-mail favourably with both fax and regular (snail) mail services, he emphasised that not only is e-mail quicker but you get an electronic document that you can manipulate. E-mail tends to be less formal than ordinary mail, which can help flatten hierarchies. He talked briefly about issues of "netiquette", privacy, and the use of UUENCODE and UUDECODE tools for converting binary (non-ASCII) files into a form that can be e-mailed.
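
As a rough illustration of what uuencoding does (my own sketch, not part of the talk), Python's standard binascii module still provides the low-level routines: arbitrary bytes are mapped into printable ASCII lines of at most 45 source bytes each, so they can travel safely in an e-mail body and be decoded at the other end.

    # Round-trip a small chunk of binary data through uuencoding.
    import binascii

    binary_chunk = bytes(range(16))               # some arbitrary binary data
    encoded_line = binascii.b2a_uu(binary_chunk)  # one printable ASCII line (uuencode format)
    decoded = binascii.a2b_uu(encoded_line)       # back to the original bytes

    print(encoded_line)
    assert decoded == binary_chunk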

While ordinary e-mail is fine for one-to-one communication, mailing lists open up the possibility of communicating simultaneously with many others with a common interest. The example provided was the Pol-Econ mailing list which George runs. Usenet groups are based on a similar idea but you must have special newsreading software to interpret the messages - you can't just use e-mail.

Goffe then returned to the four main protocols, beginning with ftp. Although ftp is one of the oldest tools available on the Internet, Goffe emphasised that file transfers using this protocol could now be accomplished in a variety of ways. In addition to the traditional command-line approach (using keywords such as get, put, etc.) it is now possible to use gopher or World Wide Web clients to achieve the same effect more conveniently. This was illustrated by Bob, who showed various screen captures from file transfer sessions. Here Bob also talked about the use of zip and unzip programs for file compression and quicker transmission.
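
For readers who prefer a script to screen captures, the anonymous-ftp convention described above can be sketched with Python's ftplib; the host and file names here are placeholders rather than sites used in the demonstration.

    # Log in as "anonymous" (e-mail address as password) and "get" a file.
    from ftplib import FTP

    def fetch_anonymously(host, remote_path, local_name):
        """Retrieve remote_path from host by anonymous ftp and save it locally."""
        with FTP(host) as ftp:
            ftp.login(user="anonymous", passwd="guest@example.ac.uk")
            with open(local_name, "wb") as out:
                ftp.retrbinary(f"RETR {remote_path}", out.write)

    # fetch_anonymously("ftp.example.edu", "pub/econ/readme.txt", "readme.txt")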

Turning next to telnet, we saw how you can use this protocol to log on to a remote computer to use programs and other resources, such as library catalogues, held there. If you are asked what kind of terminal you have, you should reply VT100, which is the most basic type in use. Bob illustrated the protocol by telnetting through to the Iowa Electronic Markets to look at a number of markets, including the presidential market which is taking bids on President Clinton's re-election in 1996.

To illustrate the gopher system of menus and pointers Bob took us to the Electronic Bulletin Board at Michigan. Here you can get on-line access to data files with Census statistics and monetary and national accounts information. As Bill argued, a lecturer can use it just before a class to have the most up-to-date information available, or can even get students to develop their own skills in on-line data retrieval.

The World Wide Web was described as "gopher on steroids". Here you have a hypertext system with contextual cross-references rather than just a series of linked menus. The "hotspots" in a document could be a word or phrase, or even a graphic image, through which you are linked to a document somewhere else on the Web. Using the Netscape browser Bob entered the URL http://www.census.gov to connect to US Census information. The demonstration illustrated the display of documents with integrated graphics by calling up a file relating to a part of Florida which included a map of the area. Bill Goffe's "Resources..." guide can be accessed via the World Wide Web - it is held at Bob's archive at Washington. Just enter the URL http://econwpa.wustl.edu/EconFAQ.html [Since moved to http://rfe.wustl.edu/ - Web Editor] and you will be able to view it. Bob jumped from document to document on the Web just by clicking on a variety of keywords and inline image icons. He showed us how you can use the Netscape menu to store URL addresses for future use or to retrace your steps back through a session in which you have wandered around different sites on the Web.

Next Bill and Bob discussed and illustrated search engines, such as the WebCrawler at the University of Washington (http://webcrawler.cs.washington.edu/WebCrawler/Home.html [Now at http://web.webcrawler.com/d/search/p/webcrawler/]), which can be used with browsing software like Mosaic and Netscape. With a tool like this you can enter a keyword (or several keywords) and search the WebCrawler database to locate documents containing your keywords.
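
The underlying idea is simple enough to caricature in a few lines of Python: an index maps documents to the text that describes them, and a query returns the documents containing every keyword. The index below is entirely made up; WebCrawler's real database was, of course, far larger and built automatically.

    # Toy keyword search over a tiny, invented index of Web documents.
    INDEX = {
        "http://example.edu/macro-notes.html": "aggregate supply and demand lecture notes",
        "http://example.edu/metrics.html": "regression and forecasting exercises",
        "http://example.edu/data.html": "census data and national accounts series",
    }

    def search(*keywords):
        """Return URLs of indexed documents whose text contains every keyword."""
        return [url for url, text in INDEX.items()
                if all(kw.lower() in text.lower() for kw in keywords)]

    print(search("census", "data"))   # -> ['http://example.edu/data.html']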

Lastly the presenters discussed and illustrated Bob's Working Paper Archive at Washington. Making use of software originally developed for high-energy physics material, Bob has established an automated system for storing and retrieving economics working papers on the Internet. Bob showed how you could search through the abstracts on-line. PostScript files of the full papers could be viewed using a program called Ghostscript, or you could download the file and print it out at your own site. (For further information on the archive see the separate material produced by Bob and his colleague Larry Blume from Cornell University, which is printed elsewhere in this issue of CHEER.)

On the Saturday of the conference there was a second session arranged around the general theme of "Computer-Aided Learning Innovations". The first presentation, called "Supporting Computer Aided Instruction with the Internet", was again by Bill Goffe. After a brief introduction covering some of the ground from the previous day's demonstration he looked at a number of ways in which the Internet could be used to enhance the quality of economics education. Giving students direct access to on-line electronic databases would obviously be of value. This would ensure that they had a better feeling for the values of economic variables and, by linking the data to graphical and statistical software where it could be displayed and analysed, they could develop their quantitative skills with real-world data. If full archives of classic data sets could be developed, students could begin to cultivate their skills in econometrics by attempting to replicate published results. By contributing to discussion groups they could learn from others and broaden their outlook. Bill was very much in favour of having students use the full resources of the Internet to undertake projects. They could exchange ideas with students at other institutions and in other countries using e-mail and perhaps even construct their own Web pages for group projects.

Throughout his talk Bill stressed two points. First, by making use of resources already on the Internet, teachers could benefit from the economies of scale that the environment provides. Duplication of effort could be avoided and you could in effect "piggy-back" on someone else's work. He went out of his way at this point to commend much of the work undertaken in the UK and to point out the benefits of a sharing and cooperative approach. The second point he wished to emphasise was the need for students to develop a full range of Internet skills as part of their general education. We should try to get a clear idea of where things are heading and ensure that our students are properly equipped with the skills that they will need in the world they will find themselves in after graduation. Thus they should know how to locate and download software and other files from remote computers, and understand and use all the main Internet protocols.

In discussing Bill Goffe's talk, Betty Blecha said that she had just started to make use of Internet tools and resources in class. She gave the students ftp problems and then got them working with Netscape. She acknowledged the debt that we all owed to Bill for providing signposts for the rest of us to follow. Kathy Nance agreed that there could be benefits of the "piggy-back" approach, but things were still too loosely organised. There needed to be more planning. She said we should stop now and ask what we are doing. Sometimes she wondered if too much work was going on which could not actually be used in the classroom because the necessary hardware was not available for end users. She liked the idea of having students join electronic discussion groups but, recalling the disruption caused by a sudden influx of Mexican students onto an unmoderated discussion group last year, she warned that they must be fully prepared and briefed on how to use them properly. Students needed to understand the etiquette as well as the technical protocols. Kathy cautioned us against just letting students loose. Learners cannot always make good choices, she said. Students could easily be overwhelmed by an "info-glut". However the Internet could help break down barriers between authors and learners. Other contributions from the floor included a suggestion that we use the Internet to inform one another more about what we do on our courses, posting syllabuses, data sets, project descriptions, problems and exam questions for general use.

The second paper of the session was by Tod Porter and Teresa Riley of Youngstown State University and was a report of a study of "The Effectiveness of Computer Exercises in Introductory Statistics". First they described and illustrated the software they have developed for teaching statistics to a mixed group of students at an open-access business faculty. The program, which they call STATX, has recently been published by West Publishing Co. It is a DOS program but has a Windows-like interface and generates questions of a number of different types on basic introductory statistics topics. Although there is a finite number of scenarios used in the examples, the numbers used in the questions are generated randomly, so a student will never get exactly the same question twice. The computer gives immediate feedback to the student and, if the wrong answer is given, shows the student how to answer the question correctly. Standard spreadsheet notation is used in formulae, helping the students to develop their transferable skills. The program is entirely self-contained, with a glossary of terms, full on-line help information and all the necessary statistical tables. A scorecard records the students' marks and keeps a record of which questions have been attempted. Students can print out a screen to take with them if all else fails and they decide that they need to go and see their tutor.
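
The flavour of randomly parameterised questions with immediate feedback can be conveyed by a toy sketch like the one below; it is emphatically not STATX itself, just an illustration of how the same scenario can be reused with fresh numbers and a worked correction when the student's answer is wrong.

    # Toy drill question: random data, immediate feedback, worked answer if wrong.
    import random

    def sample_mean_question():
        data = [random.randint(10, 99) for _ in range(5)]
        correct = sum(data) / len(data)
        answer = float(input(f"Find the mean of {data}: "))
        if abs(answer - correct) < 0.01:
            print("Correct!")
        else:
            print(f"Not quite. The mean is sum/n = {sum(data)}/{len(data)} = {correct:.2f}")

    # sample_mean_question()   # each call generates a different set of numbers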

After illustrating how the program works by working through some of the questions in the section on the Normal distribution, Porter and Riley talked about their attempt to evaluate the effectiveness of the software and its role in the student learning experience. Their results, although far from conclusive, do offer some pointers for the rest of us who wish to undertake similar studies. The study attempted to judge the value of the program at three different levels: (1) in relation to student perceptions and attitudes - do students like the program and consider it useful from their perspective as learners; (2) in relation to student performance - is there evidence that the program can improve the students' learning experience by increasing exam scores or by enabling students to achieve target levels of achievement more efficiently; and (3) in relation to Faculty costs and benefits - can savings be made in the time devoted to tutorial support and student grading (does it provide a system which can offer economies of scale?). It would seem at first glance that on the first and the third points there are unequivocal answers. The student attitude survey showed that most students rated the software as helpful or very helpful in understanding the material of the course (more helpful than textbook-based homework) and believed that their performance had improved through using it. However we must be a little cautious with these results. Despite Porter and Riley's attempts to eliminate biased responses, students would have been aware that the software had been developed by someone on the Faculty and may have tended to be more positive for that reason. The Faculty benefits and savings seem clear, especially in relation to the time spent on grading assignments, which was cut in half. The ability to use such methods beyond the introductory level may be debatable, but it is usually at the introductory level where the groups are biggest and the greatest benefits are to be had.

The effect on student performance is harder to call. The results from Exam 1 were encouraging but those for Exam 2 showed no explanatory power in the regression model. However the total number of students involved in the experiment was relatively small, and it is to be hoped that Porter and Riley can continue their work to obtain more evidence of the way that student characteristics can affect performance.

In commenting on the paper, James Clark said that he liked the program and was encouraged that West Publishing had decided to sell it as stand-alone software not tied to any particular textbook. He felt the program made good use of graphics for visualisation of concepts and liked the help buttons (although he felt they needed to be a little more context-sensitive). He also felt that the program would be improved if users could rework the SAME question immediately after an explanation of an incorrect answer, to help fix ideas, rather than immediately being given a new question to work on. He felt that any conclusions from the study must remain extremely tentative at this stage because of the small sample and the possible (subconscious) bias arising from the fact that the authors evaluated their own software. He suggested that an independent third party should be brought in to do the testing.

The third presentation of the session was from Mike Donihue of Colby College (and now at the Council of Economic Advisers). His paper (to be published in the Spring 1995 issue of the Journal of Economic Education) was on "Teaching Economic Forecasting to Undergraduates". He claimed that his approach was unique in three ways. First, he taught his course to undergraduates in a liberal arts college. Second, the students contribute to the construction and maintenance of their own macroeconomic model. Third, the students publish their forecasts, together with a commentary, in their own newsletter. Donihue briefly outlined the structure of the course. He saw the students for two 90-minute sessions each week of a 13-week semester. A course in Econometric Methods was a prerequisite, although at a push it could be taken concurrently. The course was divided into two parts: an introduction to forecasting methods (covering trend fitting, exponential smoothing, ARIMA models and multiple regression) followed by the work on updating the 52-equation COLBY Quarterly Econometric model and producing the newsletter (the COLBY Economic Outlook).
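
By way of illustration of one of the methods on that list, simple exponential smoothing can be written in a few lines. The course itself used RATS, so the Python fragment and the data below are mine, not Donihue's.

    # Simple exponential smoothing: s_t = alpha*y_t + (1 - alpha)*s_{t-1}.
    def exponential_smoothing(series, alpha=0.3):
        smoothed = [series[0]]                    # initialise with the first observation
        for y in series[1:]:
            smoothed.append(alpha * y + (1 - alpha) * smoothed[-1])
        return smoothed

    quarterly_growth = [2.1, 2.4, 1.9, 2.6, 3.0, 2.8]   # made-up data
    print(exponential_smoothing(quarterly_growth))
    # The final smoothed value serves as the one-step-ahead forecast.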

The students used RATS on Macintosh computers which were networked to their dormitories. They could communicate with Donihue by e-mail and all datasets and assignments were sent to students via the network. A computer/PAD was always available in class so that methods could be illustrated interactively there and then. Students also had to use the computer and PAD in THEIR assessed presentations. Donihue admitted that the students found RATS quite a difficult program to work with, but it was necessary to have a powerful and flexible program to use with the model. He had produced tutorials in the form of Hypercard stacks to help them get used to the program's features.

At the beginning of the course Donihue gives each student a mystery series and they are required to analyse its properties in front of the class. These oral presentations were challenging for the students, but those coming out of the course agreed that they were an invaluable part of it, building communication skills as well as statistical technique. Another feature of the course was the use of invited speakers to add a further real-world flavour: practitioners working on commercial forecasting projects were invited in to discuss their work. It is always interesting to see how colleagues elsewhere approach the task of teaching their courses, especially in areas which are potentially quite demanding and where it is not always easy to motivate the students. Some of Donihue's ideas are quite novel and will give those of us teaching courses in similar areas food for thought.

The last of the presentations came from Phil Hobbs and myself, reporting on the work of WinEcon in a paper entitled "WinEcon - A New Generation Computer Based Learning Package for Introductory Economics". (The paper is available in printed form as a CALECO Group Discussion Paper or on the CTI World Wide Web site - the full URL is http://savage.ecn.bris.ac.uk/cticce/assa.htm/).

I began by saying something about the history and scope of the project and the Economics Consortium which has worked on it. I explained a little bit about the background to the TLTP initiative and its aims. I then moved on to talk about the structure and design of WinEcon itself. I explained that it was intended to cover the whole of the first-year syllabus, organised into 25 modules providing more than 75 hours of learning material. The program has been designed to be fully interactive with a highly professional graphical look and feel, to be user friendly, intuitive and visually appealing. With hypertext links, a glossary, additional spreadsheet and graphing tools and a full set of test questions, WinEcon is intended to provide a complete and effective learning solution. All the leading texts have been indexed and an accompanying Workbook is to be published by Blackwell's. To add further flexibility, and to avoid the "not invented here" syndrome, a lecturer's interface has been provided to allow people at different institutions to customise it to their own tastes and needs.

At this point I handed over to Phil who gave a short demonstration of the program by running through some of the pages of the module covering theories of aggregate supply. Finally together we identified some of the different ways in which the program can be used such as in interactive "hyperlecturing", in lab sessions to replace tutorials, for additional individual study for remedial help or in revision, and as a method of testing understanding both at formative and summative stages.

In the brief discussion which followed, both Betty Blecha and Bill Yohe commented favourably on the work that had been done, and several people asked about the availability of the program in America. (We were later able to give conference delegates a further chance to see the program and talk with us about it, as we set up a demonstration on the Blackwell's stand in the publishers' exhibition.)

In rounding off the session Mike Lovell announced that this was his last year as an organiser. I should like to join with those who have already voiced their thanks to him for his efforts over the years, first of all in succeeding in obtaining slots in what is a very crowded conference programme and secondly for laying on such interesting and informative sessions. Next year the ASSA meetings will be in San Francisco and Tod Porter and Teresa Riley will join Bill Yohe in arranging the AEA computing sessions. (You can email Tod at FR164801@ysub.ysu and Bill at WYohe@econ.duke.edu if you would like further information about the sessions).

As I flew home to England on the British Airways Boeing 747 I started to read the book I had purchased at Mystery Books, the specialist crime and detective bookshop just a couple of blocks away from my hotel on Connecticut Avenue. "Murder at the Margin", written by Marshall Jevons (the pseudonym of economists William Breit and Kenneth G Elzinga) is an enjoyable read and a fun way to introduce economic concepts to beginners. In the book economist Henry Spearman uses his training in the subject to solve the murder of a fellow holiday maker on a Caribbean island.

OK, so these guys beat me to it on the idea of an economist-detective story. I shall just have to get my first interactive novel up on the Internet!
