©2001 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.



Internet-based Learning by Doing

Luis Anido, Student Member, IEEE, Martín Llamas, Member, IEEE, and Manuel J. Fernández


Abstract - This paper presents the current trends in Internet-based training through experimental work. We show how to apply the "learning by doing" paradigm in Internet-based distance learning, both for academic educational environments and life-long training systems, taking into account available computer and network resources. First, the different phases in the learning process are introduced. The aim of this introduction is to show the reader the importance of the learning by doing paradigm, which is not implemented in many Internet-based educational environments. Then, we identify the most important trends in this field, which can be classified into two main groups. The first consists of accessing the real equipment through an Internet interface. The second is based on simulation, very often Java-based simulation. Both approaches are discussed, including brief descriptions of currently available systems that implement them. Finally, these approaches are compared from different points of view. We point out the most significant variables to bear in mind and, as readers may face a trade-off among some of them, we also provide a graphical guide to help them in their choice.


I. Introduction

The universality of the Internet makes it a practical and suitable platform for delivering courses and other educational material. Internet and Web-based education offers remote access from everywhere and at any time. Since every lecturer can place educational material on the Web, every student or trainee in the world, not only those close to that lecturer, is able to gain access to that information. As a consequence, anybody with an Internet connection can take advantage of the knowledge of the best experts in every subject.

Nevertheless, those developing courses on the Web often seem to be confused about the difference between delivering information and delivering learning. Merely providing students with information is not sufficient for learning. Good learning material considers, for example, problem solving, intuition, imagination, and creativity as important components of learning [1]. Interactivity is another key issue in both conventional and distance education. Students' engagement with the learning process drops drastically when the level of interactivity is low. In traditional education, the highest level of interactivity is achieved in teaching labs. Students go to the lab after being taught the theoretical concepts, and there they can put them into practice by themselves.

On the one hand, practical training in teaching labs is essential to complete the learning process. In fact, when trainees control the experiments, instruments, and processes being studied, they acquire the knowledge more quickly.

On the other hand, industry demands professionals with good practical skills. Universities are sometimes said to be too theoretical. Teaching labs help to dispel this criticism by turning theory-laden students into qualified professionals.

Internet-based distance learning has many advantages for students in the university and for trainees in industry. However, it seems difficult to find a satisfactory way to provide Internet learners with the essential practical training offered by conventional teaching laboratories. Fortunately, we can find several systems on the Internet that cope with this issue and allow us to complete the students' or trainees' learning process.

The rest of this paper is organized as follows. First, we present the phases that can be found in the learning process. In section III, we introduce the Internet-based laboratory concept. Then, we analyze one of the main trends for Internet-based learning by doing, the "remote access to the real equipment". The next section presents the "simulation-based" approach. In section VI, we make a comparison between these approaches. Finally, some conclusions are presented.


II. The Learning Process

The learning process initially presents the knowledge of the domain and progressively enhances a learner’s competence in the application of that knowledge in a working environment. Although the learning process takes place in an integrated exploration environment, it is conceptually divided into several sub-processes within the system.

The learning process is not assumed to be sequential in terms of these sub-processes, and a smooth transition from one to another should be guaranteed. The number of sub-processes considered depends on the author, but all of them trace a path from the acquisition of theoretical concepts to their application on real systems. For instance, we can divide the learning process into the following four sub-processes [2]:

  1. Coarse-grained, instruction-dominated learning. In this phase, learning activities are concerned with the basic knowledge of the subject matter and obtaining an overview of the domain.
  2. Fine-grained knowledge construction. The learning activities target the domain at a detailed level to acquire an advanced understanding of specific areas.
  3. Cognitive skills development. The learning activities relate to the acquisition of competence in cognitive skills. These skills are acquired in a constructivist manner through training in a repetitive learning process.
  4. Application of the acquired knowledge and skills. The learning activities, based on cognitive apprenticeship, aim to identify and correct any misconceptions acquired or gaps left in earlier learning. The activities present different problems, including notoriously difficult cases recorded in the particular field, for the learner to attempt to solve.
Those concepts introduced in theoretical lectures (sub-processes 1 and 2) are further investigated and expanded through examples of practical implementations (sub-process 3). Experiments prove to students that what they have been taught can be found in the real world. Thus, trainees learn not only by listening, as in theoretical courses, but also by listening and seeing.

In addition, students can go a step further if they manipulate, control, and modify the experiment. In this way, learners interact with laboratory tools and experiments, checking what happens if they do this or that. This is the "learning by doing" paradigm (sub-process 4). This practical training is absolutely essential to educate good professionals, especially in scientific and technological fields. According to Hansen [3], students retain 25% of what they hear, 45% of what they hear and see, and 70% when they use the "learning by doing" methodology (see figure 1).

Figure 1: How do students learn?


III. Virtual Laboratories, a new challenge for the Internet

There are many systems that are able to deliver theoretical courses through the Internet and present their contents using a WWW interface. Placing educational material on the Web in the form of notes and assignments is now a relatively simple and quick task, given the help provided by HTML and XML editors and several available multimedia tools. Moreover, presenting the contents in a hypermedia way, with links among the different parts, is closer to the way that humans think than the linear method used in conventional books and lectures. Web-based courseware is successful enough to support the first sub-processes of the learning process. Now, academia and industry need to face the implementation of the "learning by doing" paradigm for their Internet-based distance education and training systems. A new challenge is to use the Internet as a virtual laboratory environment, where students are able to put theoretical knowledge acquired using Internet-based courseware into practice using Internet-based labware.

The virtual laboratory concept is quite general, encompassing a range of technologies and human factors that are necessary for operation in any remote environment, whether remote in time, scale or distance. A collection of emerging technologies, ranging from distributed data handling and distributed computing to the underlying network infrastructure, has the potential to make it possible to create distributed scientific and academic laboratory environments that provide complete, location-independent access to instruments, data handling and analysis resources, and enable remote human collaboration. The need for a distributed laboratory environment can arise from research and engineering involving the use of large and/or scarce facilities (e.g., large electron microscopes, various types of particle accelerators); from a single experiment being unique due to its scale (e.g., a fusion reactor); or, probably most commonly, because the learners and instructors are at different geographical locations.

We feel there are two main approaches to Internet-based training lab environments: remote access to the real equipment and simulation-based laboratories. In the former, trainees access the actual laboratory from their homes, using remote control and virtual telepresence systems to carry out the experiment. In the latter, simulators of experiments or laboratory tools are used. Both of them need to provide learners and teachers with collaborative tools to overcome the inherent geographical separation of a distributed environment such as the Internet.


IV. Using the Real Thing

Roszak [4] points out that electronic simulations lack the "messiness" of life. They are generalizations made in accordance with value judgments that may well ignore elements of a situation which contribute in a less than obvious way to the total picture. The significance, even the inclusion, of particular elements is a function not only of the judgment of the programmer, but also of the degree to which the information is amenable to being expressed in a computational form.

Students and trainees need to use the real thing, execute commands on the real tools, and modify the real experiment by themselves. Apart from the significant costs related to the development of simulators, simulation can lead to oversimplification of real experiments. The assumption that simulations provide more effective contexts than the real world for the construction of knowledge in specific domains depends on an understanding that such knowledge is readily transferable, to an appropriate degree and in a functional form, to other contexts. This, in turn, presumes the ability of the learner to distinguish between those aspects of the simulation which apply in the world outside and those which do not. Unfortunately, this is not always the case.

Internet-based laboratories can provide access through a Web browser to the real experiment in the actual laboratory. The main idea is to send commands, which can be pre-processed on the client side, to the server, execute them on the real laboratory tools or experiments, and send the results back to the client side, displaying them in a Web browser. Internet technology thus allows us to provide students with distance access to the actual laboratory tools and give them the essential practical skills.

This approach needs telepresence tools to allow trainees to follow the evolution of experiments. In addition, a safety system is needed in the real laboratory to avoid disastrous situations due to errors made by unskilled trainees. This safety system would lead the experiment to a stable and safe state.
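
As a minimal sketch of this request-reply command scheme, the following Java client sends a single command to a remote lab server and prints the reply. The host name, port and line-oriented protocol are illustrative assumptions of ours, not taken from any of the systems surveyed below.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

/**
 * Illustrative client for a remote laboratory: sends one command to
 * the lab server and prints the reply. Host, port and protocol are
 * hypothetical placeholders, not taken from any surveyed system.
 */
public class LabCommandClient {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("lab.example.edu", 5000);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {
            // The command could be pre-processed and validated on the
            // client side before being sent, as described above.
            out.println("SET motor.speed 120");
            // The server executes the command on the real equipment and
            // returns the result, which a Web front end would render.
            System.out.println("Lab replied: " + in.readLine());
        }
    }
}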
 

A. Hardware-based laboratories

There are many subjects where it is particularly interesting to use the real tools or instruments that trainees will later find in industry. Mechanics, Electrical Machinery, and Electric Power Systems are some examples. These subjects can still benefit from the advantages of Internet-based learning by doing, provided that the corresponding Web interface has been implemented, so that trainees are able to access the actual laboratory from their own homes and carry out the experiment as if they were in the real lab itself.

For instance, Oregon State University, USA, has developed Second Best to Being There (SBBT) [5]. It provides students with remote access to a control engineering laboratory. Students are allowed to guide real experiments in the actual laboratory; it is not simulation or virtual reality. Using a client/server architecture over the UDP/IP protocol, students handle a real robot from their own homes. They have absolute control over the environment and are allowed to perform any action they could carry out if they were in the real laboratory. SBBT experiments are oriented to control engineering: motors, robots, magnetic suspension systems, etc.

Students make use of a user interface with video, audio and data acquisition capabilities which provides virtual telepresence in the actual laboratory. Collaboration between students is also supported by a virtual whiteboard. In addition, a full safety system has been implemented in the real laboratory; if an unskilled trainee makes a dangerous error, it leads the experiment to a stable and safe state.

Another example is the AIM-Lab [6], which is being used at the Rensselaer Polytechnic Institute (RPI), USA, and the Norwegian University of Science and Technology (NTNU), Norway. It offers an on-line laboratory for remote education on semiconductor device characterization. Students are allowed to remotely access nine different experiments, which are actually performed on the HP4142B DC Source/Monitor manufactured by Hewlett-Packard. The interface towards the student is a Java applet running on a Web browser. The server, written in Microsoft Visual C++, includes the driver interface layer (DIL), which manages the tasks and communicates with the instrument driver (HPIB IEEE 488.2 standard protocol), and a TCP/IP server socket, which communicates with the client side through the Internet.

The AIM-Lab was applied to the characterization of a group of devices, including a set of CMOS devices and an LED. The laboratory came with eight experiments to be performed on a CMOS test chip designed by the AIM-Lab group. The experiments included are measurements of various combinations of device and inverter characteristics used for device characterization. The ninth experiment was performed on an LED device. The numerical values of the measured data and the plot of characteristics obtained from the experiment were presented in the Java applet window on the student side.
 

B. Software-based laboratories

It is also common to find conventional teaching laboratories based on software tools; Computer Science, Computer Technologies, and Software Engineering are examples of such subjects. Real experiments can be performed remotely by sending commands and getting back their results.

Many of these experiments need computer and network resources which are only available at universities or research centers. The software interacts with the environment, permitting students to act on it through a software interface and to collect results from it. Students or trainees may also need to learn how to use the software itself, for example, in a laboratory about database systems. Students can access the actual laboratory tools via a Web browser and interact with them as if they were in front of the actual laboratory desktop.

The Real Experiment eXecution (REX) [7] model, developed at the University of Genoa, Italy, deals with the execution of real experiments through the Internet. In this case, no simulation activity is performed; real experiments are carried out through the execution of sequences of real operations. The REX model implements software interfaces capable of both acting on the real world and collecting results from it. The results are then presented in an attractive form together with the virtual teacher's explanations and comments (the courseware acting as a tutor). It is worth noting that each command is really executed on a real system: this leaves the student free to test all the possible choices associated with the command, even running into error-prone situations like those in the real world.

REX is addressed mainly to software experiments, particularly in the computer science educational domain. Prototypes already developed teach network services (FTP, telnet) and protocols (TCP/IP, SNMP, ARP). Students can actually use some network services; the network traffic produced by these activities is then acquired using dedicated software (Sniffer) and explained to the students.

The REX architecture is based on a traditional WWW courseware model. The interface towards the student is written in the form of Java applets executed on a Web browser. The interface towards the real world is a collection of Java and native-code programs running on the WWW server (see figure 2).
 
 

Figure 2: The REX architecture. (Copyright IEEE 1997)

Remote access to the real laboratory equipment is also found at the Vienna University of Technology, Austria, which provides a complete Web-based database training course [8]. In addition to hypertext lecture notes, students are allowed to follow an interactive distance laboratory course. After reading and understanding the data structure on the browser, the student can proceed to the first query task, where the expected results are also shown. Students are provided with a Web interface to introduce their SQL queries, which are sent to the real database engine and executed on actual data through a CGI-compliant interface. The database software runs on an Internet-enabled computer at the University facilities. A set of exercises is defined to verify the student's correct understanding of every concept involved in database management. After entering a candidate answer, the student is shown alternative answers to the particular exercise. The software used includes an Oracle 7.3 Server on Windows NT 4.0 on an Intel PC or Solaris 2.5 on a Sun SPARC workstation, the Oracle Web Server 2.1, and PL/SQL as the server-side programming technique. The client software necessary to pass the laboratory is just a Web browser.

The second part of the laboratory is to design and implement a small database and to implement database interaction from an HTML user interface, e.g., to insert form input into the database or to generate a Web page dynamically from the database contents. The implementation is done through PL/SQL programs located in the database. They are accessed through the Web Request Broker, an Oracle-specific interface between the Web server and the database. This interface is CGI-compliant, but much faster than the original CGI implementation.
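
The Vienna laboratory is built on Oracle Web Server and PL/SQL, as described above. Purely as an illustration of the general pattern of executing a student's query on the server and rendering the result as HTML, a generic JDBC sketch might look like the following; the connection string, credentials, and whitelisting rule are our own assumptions, not part of the published system.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

/**
 * Illustrative server-side execution of a student's SQL query against
 * a training database, rendered as an HTML table. This is a generic
 * JDBC analogue, not the PL/SQL implementation described above.
 */
public class SqlLabHandler {

    public static String runStudentQuery(String sql) throws Exception {
        // A real training system would restrict students to SELECT
        // statements on the course schema before executing anything.
        if (!sql.trim().toLowerCase().startsWith("select")) {
            return "<p>Only SELECT statements are allowed.</p>";
        }
        StringBuilder html = new StringBuilder("<table border=\"1\">");
        try (Connection con = DriverManager.getConnection(
                 "jdbc:oracle:thin:@dbserver:1521:lab", "student", "secret");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            ResultSetMetaData md = rs.getMetaData();
            html.append("<tr>");
            for (int i = 1; i <= md.getColumnCount(); i++)
                html.append("<th>").append(md.getColumnName(i)).append("</th>");
            html.append("</tr>");
            while (rs.next()) {
                html.append("<tr>");
                for (int i = 1; i <= md.getColumnCount(); i++)
                    html.append("<td>").append(rs.getString(i)).append("</td>");
                html.append("</tr>");
            }
        }
        return html.append("</table>").toString();
    }
}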

Combinations of software and hardware are also found in other teaching laboratories. University College Dublin, Ireland, has recently developed OLIMPEX [9], a suite of automated measurement and characterization software tools for the study of semiconductor lasers and other optoelectronic devices. The software is written in the graphical programming language LabVIEW and controls the instrumentation over a GPIB connection. These tools, which run on a server workstation, can be fully controlled from a WWW page. They operate in exactly the same manner as usual, except that the user is not sitting at a computer in the laboratory but at some remote site. Thus, it is possible to address many of the problems and limitations of optoelectronics laboratory work (e.g., device cost). While the basic characteristics of an optoelectronic device, such as a semiconductor laser, may be examined using packages of cheap lasers, their performance is poor and their semiconductor structure is not degree-level material. Many universities have optoelectronic research activities in which more advanced, state-of-the-art lasers are being tested and studied. These laboratories have the necessary equipment to perform the more complicated characterization routines. However, the quantity of instruments and devices is usually limited to one or two, enough for research but not for undergraduate courses. With the software described in [9], a single laboratory set-up may be accessed over a network, giving multiple users access to a more advanced device.

The tiresome and difficult job of micro-positioning devices and accurately lowering probes to introduce controlling currents is left to a technician. The student is made aware of these problems and yet is not expected to overcome them with insufficient time and experience. Incorrect temperature control and the introduction of current into the laser in an unsafe manner are often the causes of device damage. The software which the students access remotely will not permit the entry of current or temperature values which may damage the laser. The device is thus protected from misuse of the controlling instrumentation and circuitry.

A second idea is the virtual measurement, which uses the same interface as a real measurement; the only difference is that the data are recalled from a database of previously taken values rather than measured directly from the device itself. Even though some of these devices may have been sent to other research centers or have been damaged, their data may still be used in an educational capacity. The architecture for both real and virtual measurements is shown in figure 3.
 
 

Figure 3: Remote and virtual measurements in the OLIMPEX environment. (Copyright IEEE 1999)


V. Simulation-based Learning by Doing

There is a common understanding that computer-based simulations involve dynamic representations of processes, incorporating to varying degrees the possibility of intervention, generally referred to as "interactivity", on the part of the user. Simulations of varying degrees of verisimilitude are currently employed [10] in a great diversity of situations where exposure to the "real" might be too dangerous or too costly, or where the original events are inaccessible because of constraints of time, distance or size. Indeed, it is increasingly assumed in many circles that everything worth knowing about can be modeled on a computer.

Taken to extremes, we have Gelernter’s conception of a "mirror world" [11] in which all "relevant" aspects of community life are modeled on a computer as "information" with which the individual may interact, with consequences which would take effect at a "real world" level.

In educational settings, a particular advantage of computer-based simulations is the degree to which modes of representation can be more closely specified, controlled and varied than in "real life" situations. Simulators can highlight the most important aspects of experiments while omitting those that are not important from a pedagogical point of view. Moreover, they could provide trainees with the most appropriate response according to how students are using the simulator. In short, simulators should adapt their behavior to that of the students in order to guide trainees in the right direction.

Unlike animation, which is used to introduce theoretical concepts and provides a description of a process that the user cannot change, simulation is controlled by the user, who is able to set the values of input variables and so change the output. Through simulation, users can modify the behavior of a process as they would be able to do using the real system.

If we look for papers in the literature about computer and Internet-based training applied to laboratories, we will find that simulation is a very common approach to provide practical training. Simulation is a valuable option in academic laboratories due to several factors:

  1. Simulation is a cost-effective solution because it needs less space and is cheaper than the real equipment used in industrial environments.
  2. Safety reasons. Unskilled learners could damage those systems used in experiments.
  3. Freedom in experiments. Incorporation of new elements or leading the experiment to extreme situations could be unthinkable with the real equipment.
The learning by doing paradigm can be fully implemented in Internet-based environments via simulation. Those virtual laboratories on the Internet that use simulation as the core of the practical training can be classified according to their architecture into two main groups:
  1. Processing on each client. Simulators of real laboratory tools or experiments can be run on students’ computers. The Java paradigm allows us to deliver software in the form of Java applets through the Internet [12] that can be run on commonly available WWW browsers in an interactive way.
  2. Processing on Server. The simulator is run on a server computer and is accessed by students remotely through the Web. The interface towards the student is a WWW browser that is used to enter commands and present their results in a user friendly way.
Both approaches can benefit from the Web-networked framework that is used to provide learners and instructors with a full collaborative atmosphere, apart from the inherent advantages of distance learning.
 

A. Processing on each Client

In order to maintain students' engagement during the learning process, it is essential to provide them with interactive tools to support the different phases of their training. The "processing on each client" architecture allows learners to use simulators in an interactive way, avoiding the network overhead.

Java fits well in this situation. Java technology takes distributed computing a step further, allowing the delivery of software through the Internet. Embedded Java applets are run on the client's computer. Thus, the level of interactivity is the highest possible, the same as that of local applications stored on the user's computer.
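
As a toy illustration of this client-centered interactivity (our own sketch, not taken from any system described below), the following self-contained Swing program simulates an "instrument" whose display reacts immediately to user input, with no network round trip involved.

import javax.swing.*;
import java.awt.*;

/**
 * Toy client-side simulator: a "signal generator" front panel whose
 * sine-wave display reacts immediately to the frequency slider.
 * All processing happens locally, so interactivity does not depend
 * on the network, which is the point of the client-centered approach.
 */
public class MiniSimulator {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Mini signal generator");
            JSlider freq = new JSlider(1, 10, 2); // frequency, arbitrary units
            JPanel scope = new JPanel() {
                @Override protected void paintComponent(Graphics g) {
                    super.paintComponent(g);
                    int w = getWidth(), h = getHeight();
                    int prevY = h / 2;
                    for (int x = 1; x < w; x++) {
                        int y = (int) (h / 2
                                - (h / 3) * Math.sin(2 * Math.PI * freq.getValue() * x / w));
                        g.drawLine(x - 1, prevY, x, y);
                        prevY = y;
                    }
                }
            };
            freq.addChangeListener(e -> scope.repaint()); // local, instant feedback
            frame.add(scope, BorderLayout.CENTER);
            frame.add(freq, BorderLayout.SOUTH);
            frame.setSize(480, 320);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}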

In educational settings, Java-based simulators of laboratory tools or experiments can be delivered to trainees. These simulators are run interactively on their computers. The same technology can be used to provide them with collaborative tools in integrated virtual laboratories. In fact, the literature shows that Java computing appears to be the main support for Web-based tele-education systems where learners acquire practical skills via simulation.

For example, the SimulNet [13] system, developed at the University of Vigo, Spain, provides a virtual laboratory to put theoretical knowledge into practice. This is done by delivering software through the Internet which can be run on any computer and operating system. These distributed applications are simulators of those tools that can be found in a conventional laboratory. SimulNet is based on three main features:

  1. SimulNet is a 100% pure Java system. Once a simulator is downloaded from the Internet, it can be run interactively, like a local application. There is no network overhead that could reduce users' engagement with the system. Therefore, students are provided with a remotely delivered, interactive simulator of a real laboratory tool.
  2. Cooperative learning in the virtual lab. It is absolutely essential to prevent students from feeling alone while using distance learning systems. To avoid this, SimulNet provides several communication tools, both synchronous (chat, talk, virtual whiteboard) and asynchronous (mail, bulletin board, notebooks). In this way, students and teachers are able to get in touch with one another, overcoming the geographical separation among them.
  3. SimulNet instructors are provided with a tutoring and monitoring tool that allows them to know what their students are doing at any time. They can use this information to fit their learning material and pedagogical methodology to their students' background and characteristics. At the same time, teachers use this tool to detect students in difficulty.
The first results from SimulNet were obtained in the field of Computer Architecture by means of the simulation of several pedagogical computers used to introduce students to microprocessors.

Nevertheless, due to bandwidth restrictions, the time for downloading the needed applet could be significant enough to justify another way of delivering the simulators. For instance, simulators on CD-ROM or DVD-ROM could be run interactively and, at the same time, could use the network to connect the simulator to the laboratory system in order to benefit from the collaboration and communication facilities. However, Java applets always provide the user with the latest version of the simulators available on the Web server. This feature is lost if we choose the CD-ROM or DVD-ROM option. This trade-off must be borne in mind before making our choice. The SimulNet laboratory platform (see its architecture in figure 4) allows its users to make their own choice, since there is a CD-ROM version of every available applet-based simulator.

Figure 4: The SimulNet architecture.

Another example of a pedagogical simulator which runs on the client side is CROS [14], developed at the University of Pisa, Italy, a simulator of a cathode ray oscilloscope. It allows students to learn how a real oscilloscope works. All the relevant parameters (threshold values, amplifications, time base settings, hysteresis) are represented on the front panel of the simulator and can be intuitively adjusted by the user, giving almost real-time feedback. It can be used as a standalone tool on a PC workstation running an X11 system over Linux.

Besides working in standalone mode, the oscilloscope simulator is also capable of Internet-based interaction between a teacher and a student via a TCP/IP connection, with the teacher's simulator in master mode and the student's simulator in slave mode (see figure 5). The teacher selects inputs and configurations to pose a problem to be solved by the student. The configuration of the master session is fully transferred to the client, and the student can then experiment with commands to find a solution. Asynchronously, the teacher can get the status of the slave to watch the student's progress. The two sides can communicate with each other by sending simple messages, so that the teacher can help and drive the student towards the correct solution.

Figure 5: Example of a client/server network session in the CROS environment. (Copyright IEEE 1998)
 

B. Processing on Server

Another approach is based on running the simulator on a server computer. Students, using their personal computers, send their commands through the Internet to be run on the server simulator. After processing a command, the reply is sent back to the student's computer. Obviously, there is a network delay which is not found in the "processing on each client" architecture; this delay also appears in remote access to the real lab tools, both hardware and software. However, the computer resources required may make it unfeasible to run the simulator on the trainee's computer. On the other hand, it might not be worthwhile to run the simulator interactively on the learner's computer if the time needed to download it is very high. In any case, students can benefit from currently available WWW technologies to preprocess their commands, provide a user-friendly interface, and reduce the network overhead.
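
A minimal sketch of this server-centered request-reply loop follows, assuming a hypothetical line-oriented protocol and a placeholder simulation engine; neither is drawn from any particular system, and a production server would handle clients concurrently.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

/**
 * Illustrative server-centered simulation loop: each client command is
 * executed by a simulator running on the server, and only the textual
 * result travels back over the network. The simulate() method is a
 * stand-in for whatever heavyweight engine the server actually hosts.
 */
public class SimulationServer {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(5000)) {
            while (true) {
                // One client at a time, for the sake of the sketch.
                try (Socket client = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(client.getInputStream()));
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    String command;
                    while ((command = in.readLine()) != null) {
                        out.println(simulate(command)); // request-reply per command
                    }
                }
            }
        }
    }

    // Placeholder for the server-side simulation engine.
    private static String simulate(String command) {
        return "OK: result of '" + command + "'";
    }
}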

Currently, we can find Internet-based practical training systems based on simulators which are run on a server computer. For example, WebMath [15], developed at the University of Illinois at Urbana-Champaign, USA, is an on-line, asynchronous learning and simulation environment for antenna education. WebMath accepts standard Common Gateway Interface input from forms and from Java applets and applications. The system permits the on-line execution of Mathematica commands with immediate or deferred output, and acts as a front end to external computational engines. Beyond conventional Mathematica kernel output, WebMath provides optional animations and Virtual Reality Modeling Language renditions. As the system is tailored for courses with significant computational analysis and design content, the WebMath developers chose commercial software, Mathematica, to run on a server computer. On top of it, WebMath packages that illustrate the fundamental properties of basic antenna types have been developed. The interface towards the student is based on Java applets and applications, allowing, in this way, Web-based remote access to the simulator. The capabilities of WebMath as a simulation environment are shown in figure 6.

Figure 6: Capabilities of the WebMath simulation environment. (Copyright IEEE 1998)

The Common Gateway Interface (CGI), as a way to make the WWW a highly portable remote display interface, had already been used by Ibrahim [16] at the University of Geneva, Switzerland. He allowed his first-year students not only to have on-line access to a hypertextual version of the book used in the Data Structures class, but also to simulate the algorithms described in that book. The students were allowed to run the simulation on the server and interact with it to set breakpoints in the code, display the contents of variables, and advance execution either step by step or until a breakpoint was met, in much the same way as with a symbolic debugger. The interface towards the students was in the form of HTML pages shown on their browsers.

The main problem Ibrahim had to face was to monitor the execution of the programs that actually implement the algorithm to be simulated, taking into account the fact that the HTTP protocol is stateless, that is, the server does not remember former queries when it processes a new one. In order to solve this problem, the WWW server would first invoke a shell script through the CGI that would analyze the query sent by the client viewer and either spawn a new process, whenever a new simulation was started, or pass the query to the spawned process, when the user was stepping through a simulation already in progress. A portion of the URL used to refer to an example program would thus contain the process ID of the application. This process ID, along with the machine name, could be used to pipe information between the server script and the appropriate spawned process, thus allowing the server to run many such spawned processes at the same time, possibly on different machines.
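
The same idea can be sketched in Java terms. Ibrahim's implementation used shell scripts and Unix pipes; the registry below is only an analogy: the server keeps a table of running simulations keyed by an identifier carried in each URL, so a stateless protocol can still step a long-lived process.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

/**
 * Illustrative work-around for HTTP statelessness: running simulations
 * are kept server-side in a registry keyed by an id that the client
 * echoes back in every request (Ibrahim encoded a process ID in the
 * URL; here we use a plain counter). An analogy, not his implementation.
 */
public class SimulationRegistry {
    // One entry per spawned simulation, analogous to one spawned process.
    private final Map<String, Simulation> running = new ConcurrentHashMap<>();
    private final AtomicLong nextId = new AtomicLong();

    /** Start a new simulation and return the id the client must echo back. */
    public String start(String program) {
        String id = Long.toString(nextId.incrementAndGet());
        running.put(id, new Simulation(program));
        return id;
    }

    /** Route a follow-up request (e.g. "step", "break 12") to its simulation. */
    public String handle(String id, String command) {
        Simulation sim = running.get(id);
        return (sim == null) ? "no such simulation" : sim.execute(command);
    }

    /** Stand-in for a debugger-like process stepping through an algorithm. */
    static class Simulation {
        private final String program;
        private int line = 0;
        Simulation(String program) { this.program = program; }
        String execute(String command) {
            if (command.equals("step")) line++;
            return program + " paused at line " + line;
        }
    }
}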

Finally, there are educational platforms that provide the technology needed to support both approaches: client-centered and server-centered simulation. That is the case of the SPU stock market simulator. Unlike the rest of the systems presented in this paper, the SPU simulator [17], developed at Seattle Pacific University, USA, has nothing to do with Engineering or Computer Science.

A challenge in teaching introductory microeconomics is making the idea of supply and demand come alive for students. A good way to do this is to let the students experience a market where buyers and sellers try to find each other and reach mutually agreeable prices. This is what can be identified as a laboratory in Economics. The general structure of the computerized trading system is as follows:

  1. The student observes market data. This consists of two types of information: current offers and completed transactions.
  2. The student makes an offer. Each offer consists of the following information: whether it is a buy or sell offer, the price of the offer and the quantity of shares in the offer.
  3. The system processes the offer and returns data. Basically, it matches buy offers with sell offers.
SPU simulation makes use of different WWW technologies to implement this trading system:

1) Processing on server. The market data is on the server. Students observe the data, which is transferred to their client machines either through their Web browser or by email. An offer is transmitted back to the server, from the browser or through email. The server processes the offers, matching them where possible into completed transactions. The updated information is transmitted back to the students through an updated Web page or through email. Common Gateway Interface (CGI) and Active Server Pages (ASP) technologies are the basis for this implementation.

2) Processing on each client. Students observe the market data on their machines. An offer is transmitted from the client to the server. The server echoes the offer back to all connected clients. The client machines receive information about offers from the server, then the clients handle the processing and displaying of results. Java applets and servlets are the basis for this implementation.
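
The offer-matching step in point 3 above admits a very simple implementation. As an illustration only (the paper does not describe SPU's actual matching algorithm), the following sketch matches an incoming buy offer against the cheapest standing sell offer whose price it meets.

import java.util.PriorityQueue;

/**
 * Toy matching engine for the classroom trading system: buy offers are
 * matched against the cheapest standing sell offer whose price they
 * meet. A minimal illustration of ours; [17] does not publish SPU's code.
 */
public class TradingBoard {
    record Offer(String trader, int price, int shares) {}

    // Standing sell offers, cheapest first.
    private final PriorityQueue<Offer> sells =
            new PriorityQueue<>((a, b) -> Integer.compare(a.price(), b.price()));

    public void postSell(Offer sell) {
        sells.add(sell);
    }

    /** Returns a completed-transaction report, or null if no match yet. */
    public String postBuy(Offer buy) {
        Offer best = sells.peek();
        if (best != null && best.price() <= buy.price()) {
            sells.poll();
            int traded = Math.min(buy.shares(), best.shares());
            return buy.trader() + " buys " + traded + " share(s) from "
                    + best.trader() + " at " + best.price();
        }
        return null; // a fuller version would keep the buy offer standing
    }

    public static void main(String[] args) {
        TradingBoard board = new TradingBoard();
        board.postSell(new Offer("alice", 10, 5));
        System.out.println(board.postBuy(new Offer("bob", 12, 3)));
    }
}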


VI. A comparison between the main trends for Internet-based learning by doing

There is no general rule to state which of the different trends in Internet-based learning by doing is the best one. It mainly depends on the particular subject being learned, the computer resources on the server and client sides, and the network overhead. We will try to outline general guidelines that can be followed to implement the learning by doing paradigm in an Internet-based environment. They can be applied both in the university and in industry.
 

A. The real thing vs. Simulation

There are several key points that must be borne in mind before choosing between remote access to the actual laboratory tools and the use of simulation: the cost of the real equipment, the effort needed to develop a simulator, the pedagogical value of simulation, and the practical skills acquired using the real thing. Our particular needs should have the final word.

In order to make our choice, we should weigh the most relevant of these points in our particular case. For instance, small schools or companies could hardly afford expensive electric power machinery, and they should give more importance to the pedagogical functionality of Web-based teaching simulators. On the other hand, companies that want to benefit from the advantages of Internet-based learning to train their highly qualified employees in the use of new equipment could use the real equipment directly. The risk of damage to this new equipment is reduced if it is used by qualified professionals, even with little guidance. The time of transition from training to high-performance work is reduced and, therefore, the costs of training are also cut.

In any case, we usually find a trade-off between two or more of these variables. The graphs in figure 7 divide the problem space into two regions, one for simulation and one for real thing-based labs. We have considered the following variables: the pedagogical value of simulators (e.g., their capability to guide students in the learning process), the effort needed to develop a simulator (e.g., human and economic costs), the practical skills acquired using the real equipment as contrasted with those acquired using simulators (the higher the former, the higher this parameter), and, finally, the cost of the real equipment.

Of course, it is extremely difficult to make these graphs quantitative; we have only tried to make them a qualitative aid for those choosing between the two approaches to Internet-based learning by doing.

If we compare the cost of the real equipment against the pedagogical value of simulation (top left graph), there are situations where simulation is a must, independently of the characteristics of the real thing. These situations are typically related to novice learners or to a first approach to a given problem or system, where controlled, simplified experiments are best suited. If the pedagogical value of simulation is not the main variable to bear in mind, the real equipment will be the best choice if affordable. When costs grow beyond a certain limit, simulation will again be the only choice, for obvious reasons.

The second graph (top right) does not consider pedagogical or educational reasons, but only the costs associated with both approaches. Typically, there is a trade-off between the two possibilities. Simulation can be used to overcome the cost of the real laboratory equipment. Nevertheless, simulation may become unfeasible because of the resources needed to develop, implement and test the simulator. Note that in this graph we assume that the real equipment is accessible, and we consider simulation as a way to overcome the problems related to the use of the real thing (e.g., safety reasons, destructive errors, etc.).

If we plot the programming effort to develop a simulator versus its pedagogical value (bottom left graph), we see that if both variables are low, we could choose either approach. However, when the pedagogical value of simulation grows, it is worth developing a simulator, provided the corresponding effort does not exceed our own resources (programming effort, financial costs, human and/or computer resources, etc.).

The bottom right graph considers the effort to develop the simulator vs. the practical skill that can be acquired using the real thing. If both variables are low, simulation appears to be the best choice. Nevertheless, if either of them increases enough, simulation may become unfeasible, because of its associated cost or because of the educational value of the real equipment.
 
 
 
Figure 7: Simulation vs. the real thing

 

B. Internet-based teaching simulation: Client-centered vs. Server-centered

The remote access to the actual lab equipment through the Internet suffers from a network overhead that reduces the interaction between the learner and the system. The simulation approach, taken to the client side, does not present this problem. The execution of simulators in the students' computer gives them the possibility to use the system interactively. The literature shows many systems that use Java-based simulation via Java applets. Typically, the simulator itself comes along with other applets responsible for providing cooperative and collaborative learning tools.

There is an important point that must be kept in mind before choosing an applet-based lab system: the time needed to download the applets may be very high. Obviously, once downloaded, applets can be run interactively on the learner's computer. An important question here is how long the learning sessions will be and what level of interaction our particular simulator needs. If the number of commands executed on the simulator is low, we might prefer the network delay on those commands to an initial download time. Likewise, if typical learning sessions are short, it is not worthwhile to wait until the simulator is downloaded. Similarly, if I had to travel from New York to Los Angeles, I would take a plane despite the one or two hours of waiting time at the airport; eventually, I would arrive earlier than if I took my car straight from my garage with no initial waiting time. However, if I had to go to a city 100 miles away, I would take my own car, because I could cover the distance in less time than I would spend waiting at the airport.
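
The same trade-off can be stated as a rough back-of-the-envelope model (the notation is ours, not the paper's). Let $T_d$ be the applet download time, $t_n$ the network delay added to each command in the server-centered case, $t_{exec}$ the execution time of a command, and $n$ the number of commands in a session:

\[
T_{\mathrm{applet}} = T_d + n\,t_{\mathrm{exec}}, \qquad
T_{\mathrm{server}} = n\,(t_{\mathrm{exec}} + t_n),
\]

so the applet pays off exactly when

\[
T_d < n\,t_n \quad\Longleftrightarrow\quad n > \frac{T_d}{t_n}.
\]

For illustration (with invented but plausible numbers), downloading a 2 MB applet over a 56 kbit/s link gives $T_d$ of roughly 300 seconds; with $t_n = 0.5$ s per command, the client-centered option wins only for sessions of more than about 600 commands.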

An alternative solution could make use of CD-ROM or DVD-ROM-based labs. Here, the interactivity is the same as in the case of Java applets. In addition, we could connect the simulator to the server side and benefit from a collaborative atmosphere. Unfortunately, we again have a trade-off, since the applet-based approach allows us to deliver the latest version of the software: every change in a simulator or collaborative tool is automatically propagated through the Web server to every client machine. With discs, we would have to press a new version of our CD-ROM or DVD-ROM software and distribute it every time we wanted to update our lab system.

The other approach for simulation-based labs is the server-centered architecture, where simulators run on a workstation and trainees access them through a Web interface. As discussed above, several reasons could lead us to choose this architecture: the computer resources required by the simulator may make client-side execution unfeasible, or the time needed to download the simulator may be too high.

Again, the final choice will depend on our particular needs and resources: the performance of our network and computers, the level of interactivity required by our simulators, the size of our simulator and the time needed to download it, how often we update the software, etc.

A trade-off among the parameters discussed above is usually found. In the next example, we focus on a situation where we already have a simulator, not Java-based, which can be run on a server computer. We do not take into account the time needed to download the software to the client side; we only consider the network delay in the execution of commands, following a request-reply scheme. In figure 8, we divide the problem space into two regions: one for client-centered simulation and the other for server-centered simulation. We take into account three parameters: the ratio between the available computer resources on the server side and the client side (the better the server computer, the higher this ratio), the network delay in the execution of every command, and the programming effort needed to develop a client-centered simulator (for example, a Java-based one).

We will comment briefly on these graphs. If the network delay is negligible (top left graph), the best solution will be the one based on the computer with the better available resources (client or server), because simulator performance and response time will depend only on that. As the network delay increases, client-centered solutions become desirable despite small advantages in the performance of the server computer; the reduction in response time will compensate for the loss of performance due to the lower resources available on the client. When the delay introduced by resource limitations on the client surpasses the delay introduced by the network, a server-centered solution will be our choice.

If we take into account only the resources needed to develop the applet version of the simulator (top right graph), the client-centered solution will be feasible if the ratio of server resources to client resources guarantees little or no loss of performance, and the development of the Java version is possible.

In the third graph (bottom left), we weigh the programming effort against the network delay. For small network delays, the server-centered solution will be preferred, because of the programming effort needed to develop a client-centered version. As the network delay increases, developing a client-centered version of the simulator, where feasible, will compensate for this delay and maintain the desired degree of interaction for end users.

The three 2D graphs can be combined into a 3D graph which, according to our particular situation, will advise us of the best solution (bottom right graph).
 
 
 
Figure 8: Client-centered vs. server-centered simulation


VII. Conclusions

The increasing performance of computer networks, both in speed and reliability, allows us to hope for a promising future in the labware field. Current trends in Web-based learning methodology present different advantages and disadvantages according to their philosophy and underlying architecture. We should choose that approach which best suits our resources and needs.

In the near future, high-speed Internet and broadband networks will provide fully interactive access to actual laboratory resources, or to simulators with high CPU and memory requirements running on server workstations. Furthermore, such high-quality networked environments will also provide us with a framework where real-time teleconferencing and multimedia collaborative tools will immerse remote lab users in a true virtual lab atmosphere.

As a consequence, centralized laboratory tools and high-performance software could be accessed from the learners' own homes. In this way, we will benefit from both the advantages of Web-based distance learning and high interactivity. In short, we will be able to provide our students or trainees with high-quality, Internet-based learning by doing.


References

  1. A. Bork and D.R. Britton Jr., "The Web Is Not Yet Suitable for Learning," Computer, Vol. 31, No. 6, June 1998, pp. 115-116.
  2. K.R. Oppermann, R. Rashev and H. Simm, "Interactive Simulation Based Tutoring System with Intelligent Assistance for Medical Education," Proceedings of ED-MEDIA & ED-TELECOM 98, Freiburg, Germany, June 20-25, 1998, pp. 765-771.
  3. E. Hansen, "The role of interactive video technology in higher education: Case study and proposed framework," Educational Technology, September 1990, pp. 13-21.
  4. T. Roszak, The Cult of Information: The Folklore of Computers and the True Art of Thinking, Pantheon Books, New York, 1996.
  5. B. Aktan, C.A. Bohus, L.A. Crowl and M.H. Shor, "Distance Learning Applied to Control Engineering Laboratories," IEEE Transactions on Education, Vol. 39, No. 3, August 1996, pp. 320-326.
  6. H. Shen, Z. U, B. Dalager, V. Christiansen, O. Strom, M.S. Shur, T.A. Fjeldly, J-Q. Lü and T. Ytterdal, "Conducting Laboratory Experiments over the Internet," IEEE Transactions on Education, Vol. 42, No. 3, 1999, pp. 180-185.
  7. M. Chirico, F. Giudici, A. Sappia and A.M. Scapolla, "The Real Experiment eXecution Approach to Networking Courseware," IEEE Transactions on Education on CD-ROM, Vol. 40, No. 4, November 1997.
  8. K.M. Göschka and E. Riedling, "Web Access to Interactive Database Training: New Approaches to Distance Laboratory Work at the Vienna University of Technology," Proc. of Teleteaching '98, XV IFIP World Computer Congress, Vol. 1, August-September 1998, pp. 349-360.
  9. J. Dunne, T. Farrel, D. McDonald and R. O'Dowd, "Introducing Optoelectronics Through Automated Characterisation of Devices and Virtual Measurement Software," IEEE Transactions on Education on CD-ROM, Vol. 42, No. 4, November 1999.
  10. C. Downling, "Simulations – New 'Worlds' for Learning?," Proceedings of ED-MEDIA 96 & ED-TELECOM 96, Boston, Mass., USA, June 17-22, 1996, pp. 344-349.
  11. D. Gelernter, Mirror Worlds, Oxford University Press, New York, 1991.
  12. "The Java Factor," Communications of the ACM, Vol. 41, No. 6, June 1998.
  13. M. Llamas, L. Anido and M.J. Fernández, "Student Participation and First Results from SimulNet, a Distance Access Training Laboratory," Proc. of Teleteaching '98, XV IFIP World Computer Congress, Vol. 2, Vienna, Austria, and Budapest, Hungary, August 31-September 4, 1998, pp. 615-626.
  14. R. Giannetti, "An Analog Oscilloscope Simulator with Internet Interaction Capability for On-line Teaching," IEEE Transactions on Education on CD-ROM, November 1998.
  15. S.E. Fisher and E. Michielsen, "Mathematica-Assisted Web-Based Antenna Education," IEEE Transactions on Education, Vol. 41, No. 4, November 1998.
  16. B. Ibrahim, "World-wide algorithm animation," Proc. First International World Wide Web Conference, Geneva, Switzerland, May 1994.
  17. D. Downing, "Computerized Trading System for Stock Market Simulation," Northern Arizona Web-Based Courseware Conference, June 1997. http://paul.spu.edu/~ddowning/stsimpap.html



Author Contact Information

Luis Anido
University of Vigo
Dpt. Tecnologías de las Comunicaciones
Campus Universitario s/n, Lagoas-Marcosende
E-36200 Vigo (Pontevedra), SPAIN
Phone: +34 986 812174
Fax: +34 986 812116
E-mail: lanido@ait.uvigo.es

Martín Llamas
University of Vigo
Dpt. Tecnologías de las Comunicaciones
Campus Universitario s/n, Lagoas-Marcosende
E-36200 Vigo (Pontevedra), SPAIN
Phone: +34 986 812171
Fax: +34 986 812116
E-mail: martin@ait.uvigo.es

Manuel J. Fernández
University of Vigo
Dpt. Tecnologías de las Comunicaciones
Campus Universitario s/n, Lagoas-Marcosende
E-36200 Vigo (Pontevedra), SPAIN
Phone: +34 986 813777
Fax: +34 986 812116
E-mail: manolo@ait.uvigo.es


Author Biographies

Luis Anido received the Telecommunication Engineering degree (1997) from the University of Vigo, with Honors awarded by the Spanish Department of Science and Education and by the Galician Regional Government. He joined the Telecommunication Engineering faculty of the University of Vigo, where he is working towards his PhD. In addition to teaching, his main interests are in the field of New Information Technologies applied to distance learning and in the area of Formal Description Techniques.

Martín Llamas received the Telecommunication Engineering (1986) and Doctor in Telecommunications (1994) degrees from the Technical University of Madrid, Spain. In addition to teaching, he is involved in research in the areas of Formal Description Techniques and Computer Based Training. He is the head of the GIST research group and the New Technologies Division at the University of Vigo.

Manuel J. Fernández graduated from the University of Santiago de Compostela, Spain, with a Telecommunication Engineering degree in 1990, and from the University of Vigo with a Doctor in Telecommunications degree in 1997. He is actively involved in research in the areas of multimedia applications, Computer Based Training and Formal Description Techniques.

