
Extending Dataplot

Introduction Although Dataplot supports a wide range of capabilities, it is inevitable that some users will want to perform analyses that cannot currently be done directly in Dataplot.

This page discusses several approaches to extending Dataplot's capabilities. There are several possible alternatives:

  1. You may want to increase the size of the data sets that Dataplot can handle.
  2. You can sometimes implement a desired capability by writing a Dataplot macro.
  3. You may want to extend the Dataplot graphical interface (GUI) to support additional analyses.
  4. The more ambitious may in fact want to augment the Dataplot source code.
Each of these will be discussed in turn.

1. Increasing the Dataplot Workspace Size

Modify Three Lines in DPCOPA.INC By default, Dataplot supports a workspace size of 10 variables by 1,000,000 rows for a total of 10,000,000 data points. You can use the Dataplot DIMENSION command to specify how those 10,000,000 points are divided between rows and columns, subject to the following restrictions (a brief example follows this list):
  1. the maximum number of rows cannot exceed the default value of 1,000,000. This default value can be increased by changing the value of MAXOBV in DPCOPA.INC. The upper limit for this value depends on the operating system and compiler used. For Windows using the Intel compiler, the upper limit is currently around 1,500,000. I have set this to around 2,000,000 on some Linux platforms.

  2. the maximum number of columns cannot exceed 50,000 (this value is actually the maximum number of all names, i.e., all variable names, parameter names, string names, and matrix names). The value of MAXNME in DPCOPA.INC defines the maximum number of names. This can be either increased or decreased.
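For example, a minimal sketch of trading rows for columns (this assumes the DIMENSION <n> COLUMNS form of the command and the default 10,000,000-point workspace):

    . request 20 columns; with 10,000,000 total points this leaves
    . 500,000 rows per variable (below the 1,000,000 row maximum)
    DIMENSION 20 COLUMNS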

If you do not have the ability to rebuild Dataplot from the source, contact alan.heckert@nist.gov and request a "larger" version. I will accommodate these requests for platforms for which I have the hardware to generate a Dataplot executable.

2. Writing Dataplot Macros

Macros are ASCII Files Dataplot macros are simply ASCII files containing Dataplot commands. There is no "compilation" required. Macros are executed using the CALL command. The Dataplot directories "programs" and "macros" contain a number of sample macros. The distinction is that macros in the "programs" directory analyze a specific data set while the macros in "macros" are meant to work with a user specified data set.
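As a minimal sketch (the file name demo.dp is hypothetical), a macro is nothing more than a file of commands:

    . demo.dp -- a two-command Dataplot macro
    LET Y = DATA 2 5 3 8 7 9
    PLOT Y

It is then executed from an interactive session (or from another macro) with

    CALL demo.dp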
Some Guidelines for Macros The following are some guidelines for writing macros.
  • All names (i.e., parameters, strings, and variables) are global.

  • There is no explicit passing of arguments to macros. As noted above, all parameters and variables are global. If a macro needs a variable or parameter to be defined, simply create it (typically with the LET command) before calling the macro. The NAME command can be used to create an alias for a variable name. This can be helpful in writing macros: the macro is coded with a generic name such as Y or X, and the NAME command is used before calling the macro to alias the desired variable to that name (see the sketch after this list).

  • If possible, try to perform data manipulations without looping through each row. As with most interpreted languages, looping can be slow in Dataplot (particularly if the number of rows is large). Dataplot provides a rich data manipulation capability, and it is worth the effort to use this capability rather than the LOOP command. Note that if there are row dependencies in the data manipulation (e.g., Y(I) depends on Y(I-1)), then looping may be unavoidable.

  • The CREATE command echoes the commands you enter to an ASCII file. This can be a convenient way to develop macros. That is, use CREATE while running an interactive Dataplot session and then edit the resulting ASCII file to create a macro for subsequent use.

  • If you are writing a macro to be used by others, it may be convenient to move the macro to the Dataplot "macros" directory. Macros in this directory are automatically available to users.
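The following sketch illustrates the global-name convention described above (the file name myfit.dp, the data file, and the variable names are hypothetical, and the NAME argument order shown, existing name first and alias second, is an assumption to verify against the NAME documentation). The macro is written against generic names and the caller aliases its own variables to those names before the CALL:

    . myfit.dp -- assumes the caller has defined the variables X and Y
    . and the parameter GAMMA before the CALL
    LET Z = Y**GAMMA
    PLOT Z X
    FIT Z X

    . calling session
    READ MYDATA.DAT TEMP PRESSURE
    NAME TEMP X
    NAME PRESSURE Y
    LET GAMMA = 2
    CALL myfit.dp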

3. Extending the Dataplot GUI

Dataplot GUI Implemented with Tcl/Tk Scripting Language The Dataplot GUI is written using the Tcl/Tk scripting language. The primary advantages of Tcl/Tk are portability across platforms and speed of development. For example, Dataplot uses the same Tcl/Tk scripts on Unix, Mac OS X, and Windows platforms. The primary disadvantage is that Dataplot does not have a "Windows" look and feel on the PC or the "GNOME" look and feel that is popular with Linux programs. Although this is a significant disadvantage, it is a trade-off that we will continue to make for the near/intermediate future. As we are not a software company, we simply do not have the staff or resources to develop and maintain native-mode GUIs for Windows, Macintosh, and Unix platforms.
GUI Decoupled from Dataplot Note that using the Tcl/Tk scripting language in a real sense decouples the GUI from the underlying Dataplot. Essentially, the GUI menus build commands that are sent to Dataplot; the GUI then captures the resulting Dataplot output and displays it in various windows. Again, this approach has both advantages and disadvantages.
Contents Via ASCII Files Another feature of the Dataplot GUI is that the contents of the Dataplot menus are specified in ASCII files. These ASCII files can be edited using any ASCII text editor.
Two Ways to Modify GUI There are two ways of modifying the GUI:
  1. You can modify the Tcl/Tk scripts to enhance the functionality of the GUI. Note that this is recommended only if you are already proficient in the use of Tcl/Tk. Basically, if you need to ask us for guidance in modifying the Tcl/Tk scripts, then we recommend against trying this.
  2. You can modify the ASCII files that specify the contents of the Dataplot menus.
The remainder of this section is devoted to discussing how to modify the GUI by modifying the contents of the ASCII files.
Structure of the Menu Files The directory "frmenus" is the top level directory that contains the menu files.

On the PC, the full path name is typically "C:\DATAPLOT\FRMENUS". On Unix, the full path name is typically "/usr/local/lib/dataplot/frmenus".

On the PC, the files are stored in a compact format. Each of the subdirectories contains the 3 files: "alldp.dat", "allmen.dat", and "alltop.dat". This was done on the PC because some PC file systems can claim an excessive amount of space when there are a large number of small files. This is due to the fact that the minimum amount of space required to store a file is in fact rather large in some of the earlier Windows file systems (that is, most of the physical space on the disk was simply empty space). This should not be an issue on newer Windows systems (both because disks are significantly larger and the space is used more efficiently). Note that you can in fact use the compact version of the menu files on Unix systems. However, there is less value in doing so on Unix platforms.

In any event, if you want to modify the menus on the PC, download the Unix version of the menu files (these are not compacted). I recommend creating an "frmen2" directory (change FRMENU to FRMEN2 in the DP.BAT file that runs the GUI). If you prefer to re-compact the files after making your modifications, contact me (Alan Heckert) and I can send a copy of the scripts that I use to do this (these are Unix C shell scripts; I have not ported these scripts to the PC).

The FRMENUS directory should contain the file "top.top" and about 20-30 subdirectories. Each of the subdirectories should contain a fairly large number of files (we are assuming you have the non-compacted version of the files).

Types of Files The subdirectories contain the following types of files:
  1. Files with a ".top" extension are either sub-menus that define secondary level menus or they are "discussion" menus (i.e., help or tutorial files). In either case, these files do not initiate any Dataplot action directly.

  2. Files with a ".men" extension are the menus that initiate Dataplot activity.

  3. Files with a ".DP" extension are Dataplot macro files. These are Dataplot macros initiated by one of the ".men" menus. Most ".men" files do not in fact initiate a ".DP" file (the Dataplot commands are built directly into the ".MEN" file).

"top.top" - the First Level Menu The "top.top" file contains the entries for top layer of menus in the GUI. It looks like:
This is Dataplot file   top.top

-----------------------
Files/Data       <~file\file.top>
Plot             <~plot\plot.top>
PlotMod          <~plmo\plmo.top>
DiagGraph        <~woch\woch.top>
Math             <~math\math.top>
Prob             <~prob\prob.top>
One Vari         <~1var\1var.top>
Browser          <~scre\browser.top>
Help             <~dp\dp.top>
Handbook         <~handbook\handbook.top>
Home Page        <xccl: web http://www.itl.nist.gov/div898/software/dataplot/>
Qual/SPC         <~qual\qual.top>
DEX              <~dex\dex.top>
Fit              <~fit\fit.top>
Stat             <~stat\stat.top>
Rel/Extr         <~reli\reli.top>
TimeSeries       <~time\timeanal.top>
Two Vari         <~2var\2var.top>
Roadmap          <~roadmap\roadmap.roo>
Manual           <~refman\refman.top>
NIST             <~nist\nist.top>
Exit             
      
Looking at this file, note the following:
  • Most of the files invoked have a ".top" extension. These implement sub-level menus. For example, "<~file\file.top>" invokes the menu defined by the file "file\file.top" (file names are relative to the "frmenus" directory).
  • The "exec exit" function is used to terminate the GUI session.
  • The "xccl" option is used to invoke a single Dataplot command immediately.
  • The "roadmap.roo" implements the "Roadmap" menu. This is a special case that is not discussed further.
A Sample Sub-Level Menu As an example of the sub-level menu, consider "file\file.top".
----- <~file\file.top> Files

Files

User files
  Discussion            <~file\discfile.top>
  View  (LIST)          
  Open  (READ/LOAD)     
  Save                  
  Print                 
  Copy                  
  Search                

Dataplot datasets
  View  (LIST)          <~dataview\dataview.top>
  Open  (READ/LOAD)     <~dataload\dataload.top>

Re-dimension Workspace     
Delete All Variables       
Variable/Parameter Status  <~data\viewdata.top>
Rename Variables           
Generate Data and Transformations  <~data\genedata.top>
Delete parts of variables  
Pack   parts of variables  

      
This menu contains ".top", ".men", and "xcl" files.
  • The ".top" files can be either further sub-level menus or discussion menus. Discussion menus are simply help files.
  • The "xcl" option indicates that what follows is a single Dataplot command to be executed immediately.
  • The ".men" options invoke menu files that invoke one or more Dataplot commands.
A Sample ".men" File An example of a ".men" file is given here.
This is file readvafi.men--Read Variables from File
 
--------------------
 1. Read Variables from File
 2. READ  
 3.
 4. User file:
 5. @CE 1 0 1 50 ?
 6.
 7. List of variables to be read:
 8. @CE 2 0 1 50 ?
 9.
10. Number of header lines to skip:
11. @CE 3 0 1 6 *
12.
13. Type of read:
14. @CE 4 1 1 35 each column => distinct variable
15. @CE 4 2 1 35 serial read
16.
17. Format (Optional & Fortran-like):
18. @CE 5 0 1 50 *
19.
20. Subset/Except/For Specification:
21. @CE 6 0 1 50 *
22.
23.
24. Do the Read?
--------------------
SU FE
SET FILE NAME QUOTE ON
SKIP _3
SET READ FORMAT _5
@IF 4 1 READ _1 _2 _6
@IF 4 2 SERIAL READ _1 _2 _6
READ _1 _2 _6
SET FILE NAME QUOTE OFF
--------------------
WEB HELP READ
--------------------
      
The code for these menus is split into several sections.
  • The first section defines the appearance of the menu. The number on the left defines the line number of the text (i.e., "4. User file:" means the fourth line of the menu will contain the text "User file:"). The "@CE" lines define the user input fields. For example,
      @CE 1 0 1 50 ?
    The "@CE" signifies a user input field. The values
    • 1 - identifies this as the first input field
    • 0 - a value of 0 indicates that there is only one input box associated with this input field, positive values identify which input box this is for that field (see @CE 4 1, @CE 4 2)
    • 1 50 - specify the starting and stopping horizontal coordinates for the input box
    • ? - the "?" is used when there is no special type of input. The following special types of input are recognized:
      • gui-character - identify plot character type
      • gui-line - identify plot line type
      • gui-ooff - an ON/OFF switch
      • gui-variable - enter a variable name
      • gui-thick - specify a line thickness
      • gui-color - specify a color
      • gui-patt - specify a pattern type
      If one of these special cases is identified, the GUI brings up a list of choices for the user to select.
  • The second section defines the Dataplot commands that are implemented when you execute the menu. The syntax "_3" means substitute the value entered for the "@CE 3" field. The "@IF 4 1" syntax means if the "@CE 4 1" entry was selected, then execute the Dataplot command entered on that line.
  • The third section defines the command that gets executed when you click the "Web Help" or "Help" button. Note that "Web Help" references the online Reference Manual that is a combination of HTML and PDF files. The "Help" references the Dataplot ASCII help files. The Web Help has a more formatted appearance and often contains sample output. However, the ASCII help files are much quicker to access.
Modifying the Menus You can modify or extend the capabilities of the Dataplot GUI by modifying the ASCII files that define the contents of the various menus. There are two cases to consider:
  1. You may want to modify an existing menu.
  2. You may want to add a new menu.
It is a good idea to document any currently defined menus that you modify in order to simplify future updates. I recommend installing any new menus under the "NIST" menu. I use the NIST menu to implement features that are of particular interest to NIST users. You can change the name from NIST to some other name that reflects the fact that the menu contains local enhancements (e.g., "Local"). You may want to save all new menus in a separate directory to better keep track of which menus were written locally.

For new menus, a good idea is to copy an existing menu with a similar structure to what you want to implement. Then modify this menu to do what you want. Although we have not documented all possible syntaxes, examining a few sample menus should make the basic structure and syntax clear.
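For example, a hypothetical local entry (the names "Local", the "local" subdirectory, and "local.top" are illustrative only) could be added to "top.top" in the same format as the existing entries:

    Local            <~local\local.top>

The file "local\local.top" would then be modeled on an existing sub-level menu such as "file\file.top", with each entry pointing at a ".top" or ".men" file that you supply.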

4. Augmenting the Dataplot Source Code

Source Code Available Open source software, as exemplified by the GNU project, is becoming increasingly popular. Although Dataplot does not explicitly utilize the GNU license, Dataplot has always made the source code available (Dataplot is trademarked but not copyrighted). Note that we place no restrictions on how you use the source code. You are free to modify it for your own purposes and are free to re-distribute it with your own applications. We appreciate, but do not require, acknowledgement.

If you do make enhancements to the source that you would like to see incorporated into the core Dataplot implementation, submit them to Alan Heckert. We will consider these on a case by case basis.

Structure of the Source Code Dataplot is in fact a rather large program. Although this can initially be intimidating, we will provide guidance on how to implement the most common types of enhancements.

Dataplot is written primarily in Fortran 77. A few graphics device drivers are partially written in C (these are interface routines between the Dataplot Fortran and a device C library).

Note that we have no plans to convert the Fortran 77 code to C. We have in fact used "f2c" to compile Dataplot on systems that do not have Fortran compilers and have found that this works well in practice. However, my experience is that it is best to view the generated C code as essentially "object code". If you use "f2c", I recommend making all source changes to the Fortran code and then running f2c as opposed to changing the generated C code.

We have considered converting the Fortran 77 code to Fortran 90. Although this would have a number of advantages, the reason I have not done this so far is that many Dataplot users do not have Fortran compilers on their systems. Although I have Fortran 90 compilers on all the platforms that I provide Dataplot executables for, users at times may need to compile the code from source (e.g., for a site that runs an older version of the operating system).

We have split the source code into approximately 60 source files. The machine dependent routines are stored in the "dp1.f" file. A number of files for common platforms are already provided, so you should not have to modify these routines in most cases. Most of the source is in the files "dp2.f" through "dp46.f". The remaining source files contain related blocks of code and generally will not be modified. The primary C files are "x11_src.c" and "gd_src.c". These are used to implement the X11 device driver and the PNG/JPEG device driver, respectively.

The primary place that Dataplot utilizes external libraries is for supporting graphics devices. These libraries are typically already provided (e.g., the "xlib" library) if the device is supported on that platform. The one current exception is that the JPEG/PNG driver uses the GD library (which in turn utilizes 3 additional publicly available libraries). I normally recommend building these 4 libraries from source rather than using the system provided versions to ensure consistency of versions.

Distinguish Different Types of Enhancements It is useful to distinguish the following types of enhancements. Each of these will be discussed separately.
  1. Add a statistic;
  2. Add a math or matrix command to LET;
  3. Add a probability distribution;
  4. Add a library function;
  5. Add a new command;
  6. Add a graphics device;
  7. Add a SET/PROBE command.
One recommendation I would make is to put any new routines in a separate file of your own. Be careful to document any current routines that you have modified. This will simplify merging your additions with any future code updates.
Adding a Statistic Dataplot supports a variety of commands that generate the value of some statistic for a variable. For example,
    LET A = MEAN Y
    LET A = CORRELATION Y1 Y2
That is, the generic syntax is:
    LET <parameter> = <name of statistic> <variable name 1> ... <variable name k>
To add a statistic, there are three routines that need to be edited/written.
  1. The routine CKSTAT (in DP4.F) parses the statistics operations under the LET command. Select a statistic with a similar syntax (i.e., the same number of arguments for the statistic name and the same number of response variables). Just follow the logic for the currently defined statistic and add your new statistic accordingly.
  2. The routine DPSTAC (in DP29.F) loops through the supported statistics. Simply add your statistic to the list.
  3. Finally, you need to write the routine that actually computes the statistic. Again, I recommend copying a currently defined statistic that has a similar structure to the new statistic. In most cases, it will be relatively straightforward to modify an existing routine to create the routine for your new statistic.
Note that the above is all that is required for the most basic use of a new statistic. However, Dataplot provides a number of commands that utilize the built-in statistics. Adding your new statistic to the list will enhance its usefulness within Dataplot. Specifically (example commands are given below),
  1. Add the computation of the statistic to the routine CMPSTA in DP4.F. This has a similar logic to the DPSTAC routine. All the remaining routines discussed below utilize this routine to compute the statistic.
  2. The routine DPJBSP in DP17.F implements a jacknife or bootstrap plot.
  3. The routine DPSP in DP29.F implements the STATISTIC PLOT command. This plots the value of a statistic for a group for a single group id variable.
  4. The routine DPCRPL in DP11.F implements the CROSS TABULATE PLOT command. This plots the value of a statistic for a group for two group id variables.
  5. The routine DPDEXPP in DP12.F implements the various DEX ... PLOT commands. Currently, we only support the case where the statistic is computed from a single response variable.
  6. The routine DPBLOC in DP9.F implements the various ... BLOCK PLOT commands. Currently, we only support the case where the statistic is computed from a single response variable.
  7. The routine DPINCU in DP17.F implements the various ... INFLUENCE CURVE commands. Currently, we only support the case where the statistic is computed from a single response variable.
  8. The routine DPISP in DP17.F implements the various INTERACTION ... PLOT commands.
  9. The routine DPTABU in DP30.F implements the various TABULATE commands (tabulate the statistic when there is a single group variable).
  10. The routine DPCRTA in DP11.F implements the various CROSS TABULATE commands (tabulate the statistic when there are two group variables).
  11. Dataplot allows you to compute a statistic for either the rows or the columns of a matrix using the MATRIX <ROW/COLUMN> <STATISTIC> command. In this case, the parsing is done in the CKMATH routine. This case is limited to statistics that are computed from a single response variable.
  12. The LET ... = CROSS TABULATE ... command is similar to, but distinct from, the CROSS TABULATE command. The parsing for the LET ... = CROSS TABULATE command is performed by the CKMATH routine in DP4.F.
Again, for all of these the only modification that needs to be made is in the parsing of the statistic name. This is straightforward once you see the logic for the already supported statistics. The computation of the statistic is funneled through the common routine CMPSTA.
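For example, once a statistic is wired into these routines, the commands listed above become available for it. Using the built-in MEAN for illustration (Y is a response variable and X, X1, and X2 are group-id variables; check the exact spellings against the Reference Manual):

    . statistic plot (DPSP) and cross tabulate plot (DPCRPL)
    MEAN PLOT Y X
    CROSS TABULATE MEAN PLOT Y X1 X2
    . tabulation (DPTABU) and cross tabulation (DPCRTA)
    TABULATE MEAN Y X
    CROSS TABULATE MEAN Y X1 X2
    . bootstrap plot (DPJBSP)
    BOOTSTRAP MEAN PLOT Y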
Adding a Math or Matrix LET subcommand Dataplot supports a variety of mathematics and matrix subcommands under LET. For example,
    LET YSUM = CUMULATIVE SUM Y
    LET A = INTEGRAL Y X
    LET YINT = CUMULATIVE INTEGRAL Y X
    LET M = MATRIX INVERSE X
    LET U S V = SINGULAR VALUE DECOMPOSITION X
Note that when adding a statistic, the left hand side is a single parameter. For math and matrix commands, the left hand side may be a combination of one or more parameters, variables, or matrices.

To add a math/matrix command, there are three routines that need to be edited/written.

  1. The routine CKMATH (in DP4.F) parses the math/matrix operations under the LET command. Select a math/matrix command with a similar syntax. Just follow the logic for the currently defined command and add your new command accordingly.
  2. The math commands are implemented in the routine DPMATC (in DP19.F) and the matrix commands are implemented in DPMAT2 (in DP19.F).
  3. Finally, you need to write the routine that actually computes the math/matrix command. For a math command, I recommend copying a currently defined command that has a similar structure to the new command. The matrix commands are implemented in MATARI, MATAR2, and MATAR3 (in DP41.F). You can add the new matrix code to one of these routines (these were split up for convenience and to keep them to a manageable size, the breakdown is essentially arbitrary).
Note that LET subcommands that generate a sequence of data are implemented separately in routines that are called directly from DPLET (in DP18.F). I recommend copying one of the current sequence commands and modifying it to suit your needs. Also, a few math commands are implemented independently from DPLET (e.g., ROOTS, INTEGRAL, DERIVATIVE). These are generally commands that have a special syntax and have a rather involved implementation.
Adding a Probability Distribution Dataplot supports a rich library of probability distributions. Supporting a probability distribution in Dataplot can include the following:
  • the cumulative distribution function (CDF)
  • the probability density function (PDF)
  • the percent point function (PPF), also referred to as the inverse cumulative distribution function
  • the hazard function (HAZ)
  • the cumulative hazard function (CHAZ)
  • the sparsity function (SF)
  • a probability plot
  • parameter estimation methods:
    • probability plot correlation coefficient plot (this can be implemented for distributions with either one or two shape parameters)
    • maximum likelihood or method of moment estimators
  • goodness of fit tests
    • Kolmogorov-Smirnov goodness of fit test
    • Chi-square goodness of fit test
    • Specialized goodness of fit tests such as Anderson-Darling or Wilk-Shapiro
At a minimum, I recommend implementing the CDF, PDF, and PPF functions. To add these, do the following:
  1. Write the CDF, PDF, and PPF functions. We typically use the convention of xxxCDF, xxxPDF, xxxPPF (e.g., NORCDF, NORPDF, NORPPF for the normal distribution). Note that we usually write these with shape parameters only (i.e., no location or scale parameters). Location and scale parameters are handled by the calling routine using standard formulas.
  2. Parsing of the probability function is performed by the routines CKLIB1 and CKLIB2 (in DP4.F).
  3. The CDF, PDF, and PPF routines are called from the routine DPLIB3 (in DP18.F).
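Once these routines are in place, the new functions can be used in LET expressions like any other library function. For example, using the normal distribution routines named above:

    LET P = NORCDF(1.96)
    LET D = NORPDF(0)
    LET XP = NORPPF(0.975)
    LET P2 = NORCDF(Y)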
If the distribution is useful in reliability applications, then you probably want to add the hazard and cumulative hazard functions. These are handled in a similar manner to the CDF, PDF, and PPF functions.

Note that once you have implemented the CDF function, adding support for the Kolmogorov-Smirnov (DP1KST and DP1KS2 in DP8.F) and the Chi-square (DPCHSQ and DPCHS2) goodness of fit tests is essentially free. Just follow the logic for the other distributions.

Likewise, once the PPF function is implemented, the probability plot (DPPP, DPPP2 in DP22.F) and PPCC plot (DPPPCC, DPPPC2 in DP22.F) are essentially free. The PPCC plot can be implemented for distributions with either one or two shape parameters (but not more than two, and it is not applicable for distributions with no shape parameter). Probability plots can be implemented for distributions regardless of the number of shape parameters.
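For example, once the PPF is in place for a distribution, the corresponding plot commands take the following form (Y is a response variable; the Weibull distribution has a single shape parameter):

    NORMAL PROBABILITY PLOT Y
    WEIBULL PROBABILITY PLOT Y
    WEIBULL PPCC PLOT Y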

For continuous distributions, random numbers can be added if the PPF function has been implemented. For many distributions, there may in fact be better algorithms for the random number generation than the one based on the PPF function. However, the PPF based method provides a generic, workable algorithm if no other better method exists. To implement the random numbers, do the following:

  1. The routine CKRAND (in DP4.F) parses the LET Y = RANDOM NUMBERS FOR I = 1 1 N command.
  2. The routine DPRAND (in DP23.F) generates the random numbers.
  3. Write the routine for the given distribution. Our convention is to call the routine "xxxRAN" (e.g., NORRAN for normal random numbers).
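For example, a sketch of generating random numbers from the command line once the xxxRAN routine is in place (the normal distribution is used for illustration; the SEED command sets the starting seed):

    SEED 305
    LET N = 1000
    LET Y = NORMAL RANDOM NUMBERS FOR I = 1 1 N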
As a final step, you can implement any special routines for the distribution. For example, Dataplot supports maximum likelihood estimation for a limited number of distributions. Maximum likelihood estimation is funneled through the routine DPMLWE (in DP20.FOR). Maximum likelihood is parsed in MAINAN (in DP40.FOR).
Adding a Library Function Dataplot supports a large collection of built-in library functions. These are documented in chapter 6, chapter 7, and chapter 8 of Volume II of the Reference Manual. Chapter 8 discusses the probability distribution functions, which are discussed elsewhere.

To add a new library function, do the following:

  1. Write the routine that implements the desired library function.
  2. Parsing of the library function is performed by the routines CKLIB1 and CKLIB2 (in DP4.F).
  3. The call to the new library function will be implemented in either the DPLIB1, DPLIB2, or DPLIB3 routine (in DP18.F). The breakdown of these three routines is fairly arbitrary and is done primarily to keep the routines to a more reasonable size.
I recommend looking for a library routine with a similar syntax to the desired function and mirroring its implementation in the above routines.

The COMPIM routine (in DP5.F) is the routine that performs the parsing of expressions containing library functions (COMPIM calls DPLIB1, DPLIB2, and DPLIB3 to evaluate any library functions that it finds).

Note that COMPIM currently only evaluates a single row of variables in the function. That is, it does not currently support functions that need to operate on an entire variable (e.g., a SUM function). Note that we are currently investigating the best way to extend this (in particular, we want to implement at least a SUM function to permit the OPTIMIZATION command to be used for maximum likelihood estimation).
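To illustrate the distinction, library functions inside an expression are evaluated row by row, while whole-variable operations such as SUM are currently available only through the statistic form of LET described earlier:

    . evaluated row by row by COMPIM
    LET Z = EXP(-X**2/2)/SQRT(2*PI)
    . whole-variable operation: use the statistic form of LET
    LET A = SUM Y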

Adding a New Command Dataplot divides commands into several general categories. Dataplot's main routine (MAIN.F) cycles through the following routines:
    MAINGR implements the graphics commands (e.g., PLOT, HISTOGRAM)
    MAINAN implements the analysis commands (e.g., FIT, CONSENSUS MEAN)
    MAINPC implements the plot control commands (e.g., TITLE SIZE, TITLE COLOR)
    MAINDG implements the diagrammatic commands (e.g., TEXT, DRAW)
    MAINSU implements the support commands (e.g., BOOTSTRAP SAMPLE, CALL, CLASS WIDTH)
    MAINOD implements the output device commands (e.g., POSTSCRIPT, DEVICE 2 SVG)
A common structure is something like:
  • MAINGR calls DPHIST to implement a histogram.
  • DPHIST parses the command line. Specifically, it extracts the arguments that identify the type of histogram, it extracts the variable names, it extracts and parses any SUBSET/EXCEPT/FOR clauses, and it copies the contents of the variables into arrays.
  • DPHIST then calls DPHIS2 to compute the plot coordinates for this histogram.
  • After returning from MAINGR, MAIN.F calls DPGRAP to render the histogram on the open graphic devices.
I recommend creating a MAINLO routine that contains any locally implemented commands. Modify MAIN.F to call MAINLO. I would look for a command that has a similar structure to the command that you would like to add. Modify it to implement the desired command.
Adding a Graphics Device If you would like to add a new graphics device, first collect the following information:
  • At a minimum, you need the protocol for initializing the device, clearing the screen, moving, drawing a line, and setting the attributes for the line. For color devices, you also need to know the protocol for defining and setting colors. I generally recommend writing a "minimal" graphics driver first that implements these basic functions.
  • The next level is to determine the protocol for supporting hardware characters. This includes drawing the character and setting its size, color, and justification. Also, if the device supports a "pixel" setting capability, this can be implemented as well. Note that you can in fact include a device driver that does not support hardware characters (use the DEVICE ... FONT SIMPLEX command to force software characters for all text).
  • Determine if the device has special capabilities that you want to support. This might include special fonts (e.g., the Dataplot Postscript driver supports 35 Postscript fonts).
  • For some devices, you need to address windowing capabilities. Note that one complication is that for some device drivers, graphics and graphical interfaces are linked. That is, you cannot use the graphic library without also building the program as a "windows" program. For example, this is the case with the Winteractor graphics library supported by the Lahey compiler on the PC. This is an issue that you may need to research a bit before trying to implement the graphics driver in Dataplot. In particular, the Dataplot GUI is written in Tcl/Tk. This may conflict with your ability to use this type of driver with Dataplot (Dataplot needs to be built in "console" mode in order to use it with the Tcl/Tk scripts).
The Dataplot graphics routines typically start with "GR" and are contained in the files DP37.F, DP38.F, and DP39.F.

Support for graphics device drivers in Dataplot was modeled after the GKS standard. Note that it is not a GKS library, but it uses a similar structure.

To implement a minimal level device driver, the following routines need to be modified.

    MAINOD (DP41.F) This is the routine that parses the commands that activate (or deactivate) a given device driver.
    GRSEPP (DP38.F) This routine sets characteristics of the device such as resolution (or picture points), continuity, and whether the device supports color.
    GRINDE (DP38.F) Use this routine to implement the device initialization.
    GRERSC (DP38.F) Use this routine to implement the erase screen function.
    GROPDE (DP38.F) Use this routine to open the graphics device (this will be a null routine for many devices). This routine is called at the beginning of a plot while GRINDE is called when the device is first turned on.
    GRCLDE (DP37.F) Use this routine to close the graphics device (this will be a null routine for many devices).
    GREXIT (DP38.F) Use this routine to exit a device. This is slightly different than the GRCLDE routine. GRCLDE is called at the end of a plot. GREXIT is called when the device is closed (i.e., a DEVICE <1/2/3> CLOSE command is entered or the Dataplot session is terminated).
    GRDRPL (DP37.F) Use this routine to implement the draw line function.
    GRMOBE (DP38.F) Use this routine to set the current position on the device (i.e., a move operation).
    GRDRLI (DP37.F) Use this routine to implement the draw line function. This routine differs from GRDRPL in how the attributes are set for the line. GRDRLI is used in a more limited sense than GRDRPL (basically, it is used for filling regions).
    GRTRCO (DP39.F) Translate the Dataplot color to a specific color on the device. Note that Dataplot supports 88 colors and 100 levels of grey scale. These Dataplot colors are mapped to the closest available color on the device.
    GRSECO (DP38.F) Set the color on the device. GRTRCO simply sets an appropriate index or name. This routine actually implements the color setting command. Note that a few devices need to set the color in the line drawing and text drawing routines. In this case, the color name/index is simply passed along and GRSECO is a null routine.
    GRTRPA (DP39.F) Translate the Dataplot pattern for a line or region into an appropriate index or name for the device.
    GRSEPA (DP38.F) Set the pattern for a line or region pattern for a device. Mostly, this is the line pattern (i.e., solid, dashed, dotted, etc.). Region patterns are normally not device specific. Note that a few devices need to set the pattern in the line drawing and text drawing routines. In this case, the pattern name/index is simply passed along and GRSEPA is a null routine.
    GRTRTH (DP39.F) Translate the Dataplot specification for line thickness into an appropriate index or value for the device. Note that this is only used for devices that support line thickness in hardware (line thickness will be implemented in software for devices that only support a single line thickness).
    GRSETH (DP38.F) Set the line thickness for a device. Note that a few devices need to set the thickness in the line drawing routines. In this case, the thickness value/index is simply passed along and GRSETH is a null routine.
    GRTRSD (DP39.F) This routine translates Dataplot 0 to 100 coordinates to device pixels. By default, this routine assumes the origin is the lower left point of the device. If your device uses a different origin, then you need to make some adjustment in this routine. For example, several devices use the upper left point of the device as the origin.
The following are routines that can be modified to include support for additional features.
    GRWRTH (DP39.F) Routine that draws horizontal hardware characters.
    GRWRTV (DP39.F) Routine that draws vertical hardware characters.
    GRDRPH (DP37.F) Routine that draws horizontal hardware polymarkers (i.e., the text that is drawn by the CHARACTER command). Note that if the device has the capability for turning individual pixels on and off (implemented in Dataplot via the CHARACTER PIXEL command), that is implemented in this routine as well.
    GRTRSI (DP39.F) Translate the Dataplot specification of the character size for hardware text into a device specific characterization of that size.
    GRSESI (DP38.F) Set the character size for hardware text for the device.
    GRFIRE (DP38.F) Fill a region. Several alternatives need to be accounted for:
    • The region can be either rectangular or non-rectangular.
    • The region fill can be either solid filled or a pattern (i.e., a mix of horizontal, vertical, and up/down diagonal lines).
    • The region fill can be performed in either hardware or software.
    For devices that support hardware region fills, I recommend adding a SET command that allows the user to specify whether hardware or software fills will be used. Although hardware fills are typically more efficient, my experience is that they can also be a bit flakey (in particular, weird results can occur if the filled region intersects other line draws on the graph). Non-solid fills are currently only supported in software.
    GRRESC (DP38.F) Use this routine to implement the CROSS HAIR command (i.e., use the mouse or some other input device to identify the coordinates of some point on the display device). This routine is only implemented for display devices.
    GRRIBE (DP38.F) Use this routine to ring the bell on the device.
    GRSAGR (DP38.F) Use this routine to implement the SAVE PLOT, CYCLE PLOT, and REPEAT PLOT commands. This is currently only activated for a few specific devices (and it only makes sense for display devices).
The following are additional routines that do not typically need to be modified. These routines often contain a skeleton structure for the different devices. However, no device specific code is typically implemented. You can add the new devices to the skeleton or you can simply ignore them.
    GRCOSC (DP37.F) Routine to "copy the screen". This was a routine used by the old Tektronix terminals that had "copy" units attached. It is generally a null routine for other devices. However, it could be used to initiate a "print screen" type function if the device has some automatic way to generate this.
    GRDETH (DP37.F) Routine that determines the length of a horizontal text string. This routine currently only has active code for the QMS device (which is essentially obsolete).
    GRDETV (DP37.F) Routine that determines the length of a vertical text string. This routine will be a null routine for most devices.
    GRTRCA (DP39.F) Routine to translate the case (upper or lower). Currently, this is a null routine for all devices.
    GRTRDI (DP39.F) Routine to translate the direction (horizontal or vertical). Currently, this is a null routine for all devices.
    GRTRFI (DP39.F) Routine to translate the fill pattern. Currently, this is a null routine for all devices.
    GRTRFO (DP39.F) Routine to translate the character font. Currently, this is a null routine for all devices.
    GRTRJU (DP39.F) Routine to translate the character justification. Currently, this is a null routine for all devices.
    GRSEMO (DP39.F) Routine to set "graphics" or "text" mode. This was necessary for some types of Tektronix terminals. It is not needed for most devices.

Note that many devices are supported by calling either a Fortran or C based subroutine library. For C based libraries, I have had good success by writing my own C routine as an interface to the C library. That is, the Dataplot Fortran calls my C routines which then call the C library. The advantage of this approach is that it makes portability easier by giving me more control over the types of arguments passed between Fortran and C. In particular, I only pass integers and reals between the Dataplot Fortran and my C code. The passing of strings and data structures only occurs from my C routine (i.e., at the C to C level). Although this may introduce some slight loss of efficiency, the much simpler portability makes this a worthwhile tradeoff. In particular, the X11 and GD (for PNG and JPEG) device drivers are coded this way. A similar approach is being used for a few additional device drivers currently under development. You can examine the files "x11_src.c" and "gd_src.c" for guidance.

Adding a SET/PROBE Command A number of attributes can be defined via the SET command. The SET command typically defines some type of switch. The PROBE command is used to return the current value of that switch. To implement a SET command, do the following:
  • Modify the include file DPCOST.INC to add the desired switch to a common block. Switches for output devices are typically defined in the DPCODV.INC include file.
  • The SET command is implemented in DPSET (in DP2.F).
  • The PROBE command is implemented in DPPROB (in DP2.F).


Date created: 06/05/2001
Last updated: 09/20/2016
