EPICURE Design Note 84.2

SWIC Datapooling Services

S. A. Ramsey

Fermilab Research Division

E.E. & E. Dept. Controls Group

MS 220, P.O. Box 500, Batavia, IL 60510

July 11, 1989

Background

The word SWIC is an acronym for Segmented Wire Ionization Chamber, an instrument used in experimental areas to monitor the profile of a high-energy charged-particle beam of intensity ranging from 1 10 to 3 10 particles per beam spill. A SWIC is simply an airtight metal box containing two planes of 48 parallel high-voltage wires each. The wires are vertically oriented in one plane and horizontally oriented in the other. The box is filled with an Ar-CO2 gas mixture. When high-energy charged particles pass through the chamber, the argon atoms ionize, giving off electrons. The carbon dioxide in the mixture prevents the electrons and the argon ions from recombining. Each charged particle ionizes roughly 50 argon atoms, so by measuring the number and distribution of electrons in the chamber (which should be in a one-to-one ratio with ionized argon atoms once background noise is subtracted), a profile of the charged-particle beam can be obtained. A beam profile can also be obtained by measuring the distribution of argon ions in the gas mixture.

The SWIC measures the charged-particle distribution in the following manner: when a high electric potential is applied to the two wire planes, the charge carriers in the gas are attracted to the wires. If the wires are negatively charged, they attract argon ions; if the wires are positively charged, they attract the electrons liberated from the argon gas by the charged-particle beam. In this fashion, each wire in the device gains or loses charge in proportion to the intensity of the charged-particle beam in its vicinity. The high-voltage wires are allowed to accrue electric charge for a specified period, called the CHARGE time. Hence, each individual wire may be thought of as a capacitor whose charge is an element of a vertical or a horizontal beam profile. In other words, to the outside world, a SWIC is essentially a set of 96 high-voltage capacitors.
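As a rough, illustrative estimate of the signal involved (the factor of 50 ion pairs per particle is from the description above; the geometric factor is a stand-in, not a measured quantity), the charge collected on one wire during a spill is approximately

$$Q_{\mathrm{wire}} \approx N_{\mathrm{spill}} \cdot f_{\mathrm{wire}} \cdot 50 \cdot e ,$$

where $N_{\mathrm{spill}}$ is the number of beam particles in the spill, $f_{\mathrm{wire}}$ is the fraction of the beam whose ionization drifts to that wire, and $e \approx 1.6\times10^{-19}$ C is the elementary charge. The relative charges on the 48 wires of a plane then trace out the beam profile in that dimension.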

The function of the SWIC is to translate beam intensity into electric charge. But that alone does not provide us with a human-readable beam profile. As the high-energy charged-particle beam passes through a SWIC, the device which actually samples the charges collected on the high-voltage wires is called a SWIC scanner. The SWIC scanner is more precisely described as a ninety-six-channel, charge-integration-type A/D device based on the Z80 microprocessor. It is capable of carrying out the following data-acquisition tasks:

At present, there are several different varieties of SWIC scanners in the field. The original SWIC scanner and the SWIC scanner II are the two types presently in use in beamline operations. Fortunately, the two scanners behave similarly enough that the SWIC datapooling services application will not have to differentiate between them. At present, it is not envisioned that the datapooling application will need to support the SWIC scanner III presently in use in the Accelerator Division, although the application will probably be able to handle varying-length header information with little or no modification.

In the past, user interaction with SWIC scanners has generally fallen into two broad categories: device control and data acquisition. Many devices, including SWIC scanners, were controlled interactively under the EPICS system. Users had the capability of remotely setting and configuring SWIC scanners using the SET and RSD commands. Device control was facilitated by the CAMAC serial link and the 032 and 036 CAMAC modules. Monitoring of SWIC scanner data was made possible by the scanner's video display hardware and the use of television monitors. Users archived SWIC profiles by photographing the video displays. When necessary, remote acquisition of scanner data was carried out by block data transfers over the CAMAC serial link. For these operations as well, the CAMAC 032 and 036 modules were used as buffers. Both device control and data acquisition were possible from remote nodes under the EPICS system.

Under the present EPICURE system, control of SWIC scanners from remote DAR-serviced nodes is made possible by the SWIC Reboot and Modify Program by Stephen Baginski. Refer to his design note for more information on the control and configuration of SWIC scanners under EPICURE. But while software exists to reboot and configure SWIC scanners, an application to facilitate simple and transparent remote acquisition of SWIC scanner data by DAR-serviced nodes has not yet been implemented.

Objective

Because of the high demand by users for SWIC scanner data, it is desirable that the service of making such data available to users of the present data acquisition and control system be continued and improved in accordance with the new data acquisition resources available under EPICURE. Because a program to control and configure SWIC scanners has already been written, this design note focuses on the related but separate task of reading SWIC scanner data. At present, there are over 120 SWIC scanners in the field, each of which may be used to monitor more than one SWIC device. If we were to instruct every SWIC scanner in the field to spit out 10 digital beam profiles plus calculated data in one beam cycle, we would receive a minimum of 140,000 bytes of data. This represents the set of all possible SWIC data for a particular beam cycle. Our objective is to provide users with the greatest possible subset of all SWIC scanner data in a given beam cycle, subject to the following constraints:

Hardware Problems

The SWIC scanner presents a number of unique problems in meeting this objective. These difficulties fall into the general categories of configuration, speed, and data. When a scanner is in its CHARGE phase, it will not allow any reads of any profiles until it enters the HOLD cycle. However, it is impossible to determine remotely (over the serial link) by a read operation either the CHARGE or the HOLD settings of the SWIC. This makes it very difficult for an application to know when to request data from a SWIC scanner. It is also impossible to remotely determine the POLARITY, MODE, and method of clocking (internal or external) of a scanner. Furthermore, all settings on the scanner, including the device's internal NAME and CHARGE/HOLD times, can be changed or wiped out by a reboot or by external programming. About the only things that we can remotely determine about a scanner are its vertical and horizontal gain and status, which are returned as header bytes with every data block. Hence, the device hardware prevents interleaving of reads with scans (i.e., reading during CHARGE time) or any other sort of synchronization with the scanner. In this respect, we are confined to reading the scanner ``blindly,'' entirely dependent upon the operator to ensure that the device is clocked, configured, and running properly.

Another problem encountered in reading SWIC scanners is that their response times, from receipt of a command to the output of data blocks (see the SWIC documentation for more information), are slow. Table 1 lists the response times of the SWIC scanner to three different types of data requests.

An application cannot do a set operation to a SWIC scanner and then immediately check for data in the read FIFO, because in all probability the data won't be there (unless an error occurred, in which case the response times are on the order of 0.4 milliseconds). Instead, the application must wait until the scanner has deposited the requested data into the read FIFO, which can take anywhere from 1.8 ms to 6.3 ms. To avoid this problem, the VAX must interleave its set and read requests so that it does not get a ``No-Q'' error from the 032 module. It also takes a minimum of three read operations to grab all of the data from the SWIC, because the SWIC will not spit out more than 5 scans at a time. The slow response time of the scanner poses timing difficulties for a data acquisition application.
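The sequencing constraint can be summarized in a short sketch. The helper routines below (swic_request_block(), fifo_byte_count(), fifo_read(), wait_ms()) are hypothetical wrappers around the real EPICURE set and read services, and the delays are the rough figures quoted above; the actual application interleaves requests across scanner groups rather than busy-waiting, but the ordering is the same.

/* Sketch only: the routines below are hypothetical wrappers around the
 * real EPICURE/CAMAC services.  Only the ordering and the delays reflect
 * the scanner behavior described in the text.
 */
#define RESPONSE_MIN_MS  2      /* data rarely appears before ~1.8 ms     */
#define RESPONSE_MAX_MS  7      /* ...and may take up to ~6.3 ms          */

extern int  swic_request_block();   /* set operation: ask scanner for a block   */
extern int  fifo_byte_count();      /* byte count of the 032 read FIFO          */
extern int  fifo_read();            /* destructive read of the 032 read FIFO    */
extern void wait_ms();

int read_one_block(int scanner, int block, char *buf, int len)
{
    int count;

    swic_request_block(scanner, block); /* tell scanner to fill the read FIFO   */
    wait_ms(RESPONSE_MIN_MS);           /* an immediate read would find nothing */

    count = fifo_byte_count(scanner);
    if (count == 0) {                   /* still busy: wait out the worst case  */
        wait_ms(RESPONSE_MAX_MS - RESPONSE_MIN_MS);
        count = fifo_byte_count(scanner);
    }
    if (count == 0)
        return 0;                       /* caller decides whether to retry      */
    return fifo_read(scanner, buf, (count < len) ? count : len);
}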

The status information returned in the header bytes of data read from a SWIC scanner tells only whether or not the scanner is ready to send data; it tells nothing about what sort of problems the scanner might be encountering in servicing an application's data request. This forces the application to distinguish for itself the severity of an error and whether or not it should retry. Finally, SWIC scanners are unable to return reliable statistical calculations for some beam profiles, particularly when the profiles have multiple peaks.

Because the 032 is basically a FIFO buffer, reads from the 032 are necessarily destructive. This means that users cannot be allowed to access the 032's directly every time they are interested in obtaining SWIC scanner data, because if one user read the contents of the read FIFO, then no other user would be able to obtain the data. Another problem is that in order to get the scanner to deposit its scans into the read FIFO, a CAMAC write operation must be performed, which will upset the write FIFO if the control program is simultaneously attempting to configure the scanner. Write accesses to 032 modules will have to be interleaved between SWIC scanner control and SWIC scanner data acquisition tasks in order to avoid this problem. These shortcomings of the 032 module are among the primary difficulties in designing an application which will meet our objective of making SWIC scanner data available to users of EPICURE.

Solution

It has been suggested that a pool of all, or some large subset of, available SWIC scanner data be maintained in VME common memory on one or more µVAX front-ends, where the Data Acquisition Server (DAS), the datapooling process, and an 032 device handler on a 386 module (the DAE) in the VME crate can access it. Such a pool would ideally be updated every beam cycle for all scanners in the field that have scan data defined. Users could then access scanner data through regular EPICURE data acquisition requests. The only difference would be that the data would move over the QVI and DECnet, with no CAMAC operations involved. Multiple simultaneous requests for some block of data would be easily serviced by DAR, as opposed to the situation when reading directly from the 032's, where multiple reads of the same data are impossible. This service would be called the SWIC Datapooling Service.

This datapooling strategy still poses the problem of obtaining over 140,000 bytes of data each beam cycle from 120 SWIC scanner units down the beamlines. Ideally, a block data transfer network consisting of as many as 6 ARCNET links would make all SWIC scanner data available to the front-ends each beam cycle. The data transmission rate of the ARCNET-based system would be far greater than that of the CAMAC serial link (1 Mb/s). The network would also support transfer of large blocks of data, unlike CAMAC. Unfortunately, RDCS will not be able to provide such a network to its users for the upcoming run, although the hardware to support it has been designed. However, because there is a possibility that a block data transfer network might be implemented in the future, any solution to the problem of moving SWIC scanner data from the 032's to the front-ends must (1) be considered temporary, and (2) be designed so that a minimum of code need be modified if and when the ARCNET links are set up. In other words, the datapooling service must act exactly as the block data network would, so as to maintain transparency to the user and to facilitate an easy transition if such a network is installed.

Because we have an immediate need for SWIC scanner data during the next run, the CAMAC serial link will be utilized to meet this need. Unfortunately, the serial link was not designed with this kind of repetitive, large block data transfer in mind. In fact, if SWIC data blocks, some of which are over 490 bytes long, were transmitted over the serial link during the flat top, users' requests might not be processed in a timely manner. During periods of heavy use, it is conceivable that moving upwards of 140K during the flat top might put a strain on the capabilities of the CAMAC serial link. Two strategies will help minimize the load that the datapooling service will place on the serial link. First, the datapooling service will move data during the period immediately following the flat top, say, after T+2, when there would presumably be few data acquisition requests. Second, the datapooling service will break its CAMAC 032 reads down into blocks of data ranging in size from 110 bytes to 174 bytes. This will allow other users' requests, should they wish requests to be serviced during that period, to be interleaved with those of the program. It is intended that as much SWIC scanner data as possible will be pooled. Whether or not the datapooling service will reside on more than one front-end has not yet been decided. By distributing the 032 block data reads over the three front-ends, the load on DAR's internal memory might be reduced, program performance would be improved, and the data transfer load could be distributed more evenly among the three CAMAC serial links.
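For a rough sense of scale (an estimate derived from the figures above, not a measurement): with 120 scanners and 11 data blocks per scanner each cycle (10 scans plus calculated data), a full pool requires

$$120 \times 11 = 1{,}320$$

block reads per beam cycle; at roughly 110 to 174 bytes per block, this is on the order of 145 to 230 kilobytes, consistent with the 140,000-byte minimum quoted earlier. Spread over three front-ends and their serial links, that load works out to roughly 440 block reads per link per cycle.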

Structure

There are seven elements to the proposed SWIC datapooling software project:

Front-End Application and Input File

The SWIC datapooling application will consist of a non-interactive process that will be installed as a detached process on one or more µVAX front-ends at boot time. The process will read the contents of a protected ASCII file locally resident on the front-end that will contain a list of the 032 modules that the process is to set and read (based on the host node), as well as timing and error retry information. It is probable that additions to the text file containing 032 device information will be made as the program takes shape. After reading in the necessary beam profiles from the 032, the application will deposit them in VME common memory for users to read through DAR. This requires partitioning off a section of VME common memory at boot time (which is done automatically by commands in the Common Memory Initializer Parameters File) to accommodate the incoming scanner data. When all relevant scanner data is maintained in a pool in common memory, reads of the same data by many users will not bog down the CAMAC serial link, because the data need only be transferred through the QVI and over DECnet. Because it is non-interactive, the application will not be able to give ``screen'' alerts of errors and warnings. It will instead redirect error messages to an error log file and to OPCOM (see Errors, Section 6). The format of the input file will be as follows:

!+
! TITLE:        SDP_INPUT.DAT
!
! VERSION:      1-003
!
! FACILITY:     SWIC Datapooling Services
!
! ABSTRACT:     This input file contains the names of the CAMAC 032 modules
!               that each of the front-ends is supposed to read.  It also
!               contains information on start times, error retry, and 032
!               grouping.
!
! ENVIRONMENT:  Input file to SWIC.EXE
!
! AUTHOR:       S. A. Ramsey
!
! CREATION DATE: 27-Jun-1989
!
! MODIFICATION HISTORY:
!
!   VERSION   DATE          BY    DESCRIPTION OF MODIFICATION
!   1-001     27-Jun-1989   SAR   Original.
!   1-002     10-Jul-1989   SAR   Changed environment to SWIC.EXE
!   1-003     18-Jan-1990   SAR   Updated for new file format
!
!-
\+               ! Beginning of file
TSTART=200;      ! Start data collection (time)
TEND=3000;       ! End data collection (time)
NRETRY=50;       ! Maximum number of retries
PERGRP=1;        ! Number of SWIC records per group
POOLNO=0;        ! Common memory datapool number
DRETRY=2;        ! Default CAMAC read retry
QVILOCK=3;       ! Default retry for QVI locking
\*               ! Record delimiter string
NAME=EC032;      ! 032 Device name
RETRY=2;         ! Maximum number of retries
NODE=BUGS;       ! Which node will pool data
INDEX=1;         ! Channel number in common memory
\*               ! Record delimiter string
NAME=NE1WC1032;  ! This is the second record
RETRY=2;         ! in the input file SDP_INPUT.DAT.
NODE=DEWEY;      ! It is in a separate group from
INDEX=2;         ! Common memory channel number
\-               ! End of file.
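A minimal sketch of how the detached process might pick apart one line of SDP_INPUT.DAT is shown below. The keyword names (NAME, NODE, RETRY, INDEX, and so on) come from the example file above; the parsing routine itself, and its name, are illustrative assumptions rather than the actual implementation.

/* Illustrative only: a minimal parser for one "KEYWORD=value; ! comment"
 * line of SDP_INPUT.DAT.  Returns 1 and fills keyword/value if the line
 * holds an assignment, 0 for comments, delimiters, and blank lines. */
#include <string.h>

int sdp_parse_line(const char *line, char *keyword, char *value)
{
    const char *eq, *semi;
    unsigned int klen, vlen;

    while (*line == ' ' || *line == '\t')
        line++;
    if (*line == '!' || *line == '\\' || *line == '\0')
        return 0;                       /* comment, record delimiter, or blank */

    eq   = strchr(line, '=');
    semi = strchr(line, ';');
    if (eq == 0 || semi == 0 || semi < eq)
        return 0;                       /* not a well-formed assignment */

    klen = eq - line;
    vlen = semi - eq - 1;
    strncpy(keyword, line, klen);   keyword[klen] = '\0';
    strncpy(value,   eq + 1, vlen); value[vlen]   = '\0';
    return 1;
}

A caller would loop over the file with fgets(), dispatching on the returned keyword to fill in the global defaults and the per-032 records.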

Database Entries

At present, there is one database device entry for each 032 module of interest. Sets to the associated SWIC scanner are performed by a call to ds_add_request_...() with the property SETTING. A handler in the VME crate processes the write request and copies the data to the write buffer of the 032 module with the CAMAC operation F16A0. This handler is already written, and it permits both reads from and writes to the FIFOs. Reads of the FIFO byte counts are performed by a call to da_add_request_...() with the property READING. Again, the device handler translates the request into a CAMAC F1A0 command, which is then sent to the 032 module. Reads of SWIC scanner data from the read buffer of the 032 are accomplished (when the buffer has data in it) by a call to da_add_request_...() with the property SETTING. This is translated by the device handler into a CAMAC F0A0 operation. It is possible that the CONTROL property may be split off from the present 032 module device entries in the database, and that new device entries for the datapooling application may be added to the database. Table 2 outlines the envisioned changes to the 032 device entries. Note that in Table 2 the acronym SDP refers to the datapooling program that will run on the front-ends, and the three columns in CAPS correspond to device properties. See the EPICURE User's Guide by Bill Higgins for more details about device properties. Refer to Bill Higgins's SWIC scanner users manual for more information about CAMAC operations involving the 032 module.
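The property-to-CAMAC mapping just described can be summarized in a small table. The pairings below are taken from the text; the enum and table names are illustrative only.

/* Summary of the 032 access paths described above, written as a small
 * lookup table.  Only the property/operation/function-code pairings come
 * from the text; the names here are illustrative. */
enum swic_access {
    SWIC_WRITE_SETTING,   /* ds_add_request with SETTING: write to 032 write FIFO  */
    SWIC_READ_READING,    /* da_add_request with READING: read FIFO byte counts    */
    SWIC_READ_SETTING     /* da_add_request with SETTING: read data from read FIFO */
};

struct swic_camac_map {
    enum swic_access access;
    const char      *camac_op;        /* CAMAC function/subaddress used by handler */
};

static const struct swic_camac_map swic_map[] = {
    { SWIC_WRITE_SETTING, "F16A0" },  /* deposit command bytes in the write FIFO   */
    { SWIC_READ_READING,  "F1A0"  },  /* return read/write FIFO byte count         */
    { SWIC_READ_SETTING,  "F0A0"  }   /* destructive read of the read FIFO         */
};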

386 Device Handler

A ``device'' handler resident on a '386 in the VME crate will redirect SWIC data reads to VME common memory. The handler will be accessed by the device names that users will call in order to access 032's in the field. But rather than going out to the CAMAC link, a user's request will be sent to the handler. The handler will then grab the data out of common memory and place it in the reply queue.

Common Memory

The datapooling application will receive a pointer to an address in VAX virtual memory that corresponds to the data pool by calling QVI_CS_POOLPOINTER(). The common memory datapool will be organized in the program as an array of structures, all of the same size. Each structure is as outlined in Appendix A. The size of each structure is 2092 bytes, which aligns on longword boundaries for consistency with the service QVI_CS_VAX2CM(), which stuffs data into the data pool in common memory. The bread and butter of this datapooling software system will be a non-interactive application on one or more front-end nodes which will attempt to read the SWIC scanner profiles of all SWIC scanning devices listed in its data file from the 032 modules and to stuff them into a data pool in VME common memory, where they can be accessed via non-destructive reads over DECnet with DAR calls. The application is temporary, and it will be designed to work without disrupting the SWIC scanner reboot and modify (control) program.
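A minimal sketch of how the application might address the pool as an array of fixed-size records follows, using struct SWIC_CM_Block and SWIC_C_POOLNO from Appendix A. The argument list of QVI_CS_POOLPOINTER() and the location of the include module are assumptions; only the idea of indexing the pool by the INDEX field from the input file is from the text.

/* Sketch only: struct SWIC_CM_Block and SWIC_C_POOLNO come from Appendix A;
 * the argument list of QVI_CS_POOLPOINTER() and the error handling are
 * assumptions made for this sketch. */
#include "epicure_sysinc:swic_cm.h"       /* assumed home of the CM structures */

extern struct SWIC_CM_Block *QVI_CS_POOLPOINTER();

static struct SWIC_CM_Block *swic_pool;   /* base of the datapool in P0 space  */

/* Locate the common memory record for one scanner, using the INDEX field
 * from SDP_INPUT.DAT as the channel number within the pool. */
struct SWIC_CM_Block *swic_cm_record(int index)
{
    if (swic_pool == 0)
        swic_pool = QVI_CS_POOLPOINTER(SWIC_C_POOLNO);
    return &swic_pool[index];
}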

Function

The SWIC datapooling application will not attempt to synchronize its sets to the SWIC scanners with their programmed HOLD and CHARGE times, for three reasons. First, because the SWICs can be timed externally as well as with software, there is no guarantee that the HOLD and CHARGE times will accurately predict when the SWIC scanners will be in the `holding' phase. Second, the status byte received back from the SWIC does not differentiate between `not ready' and some more serious type of error, so there is no way for the SWIC datapooling program to determine whether or not the SWIC scanner has crashed. Finally, the SWIC scanners return only vertical and horizontal gain information; no information about the internal CHARGE and HOLD settings can be obtained through an 032 read. However, this means only that we cannot obtain an accurate record of the time at which a particular SWIC scan was obtained by the scanner; as long as data resides in the SWIC scanners and they are in the proper mode (i.e., STORE vs. DISPLAY), the datapooling application should be able to read the beam profiles from the 032.

System Initialization

When the front-end node boots, VME common memory is initialized and its structures, including any defined datapools, are set up. The SWIC scanner data pool will be initialized with enough space allocated for storing information about 120 scanners (see Appendix A for the size of the SWIC scanner common memory data structure). Another command in the EVERYBOOT job will start the SWIC datapooling service by installing the datapooling application as a single-image detached process. A logical name, SDP$DEFAULT, will be defined in the system logical name table. It will direct the detached datapooling process to a VMS text file where the program's input data resides. The following subsections describe the workings of the program from within its process.

Program Initialization

The first thing the application will do at startup is initialize the EPICURE service routines with calls to da_init() and ds_init(). The application will then look for a valid VMS file specification via the system logical name SDP$DEFAULT. If the program finds the input file, it loads the file, processes it into records and fields, and closes the file. Verification will be made that the 032 device names in the input file exist and correspond to functioning devices, and the 032 devices will be split up into groups whose size is a parameter defined at compilation. The application then figures out how much memory it is going to need and calls QVI_CS_MAPCM() to map its common memory data pool into VAX virtual memory; it then checks the size of the pool to verify that there is enough space for all of the SWIC scanner data. The program requests a pointer to its proper data pool (whose number is found in the program's input text file) with a call to QVI_CS_POOLPOINTER(). Finally, the application sets up the proper data structures in common memory.
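The startup sequence just described can be outlined as follows. da_init(), ds_init(), QVI_CS_MAPCM(), and QVI_CS_POOLPOINTER() are the services named in the text; their argument lists, the sdp_* helper routines, and the return conventions shown here are all assumptions made for the sake of the sketch.

/* Outline only: the argument lists and the sdp_* helpers are assumptions. */
#include "epicure_sysinc:swic_cm.h"   /* for SWIC_C_POOLNO (see Appendix A)       */

extern int   da_init();              /* EPICURE data acquisition services         */
extern int   ds_init();              /* EPICURE settings services                 */
extern int   QVI_CS_MAPCM();         /* map VME common memory into the process    */
extern char *QVI_CS_POOLPOINTER();   /* return base address of a given datapool   */

extern int   sdp_read_input_file();  /* hypothetical: parse the SDP$DEFAULT file  */
extern int   sdp_verify_devices();   /* hypothetical: check 032 names against DB  */
extern int   sdp_group_devices();    /* hypothetical: form the 032 groups         */

int sdp_startup()
{
    da_init();
    ds_init();

    if (!sdp_read_input_file("SDP$DEFAULT"))   /* logical name points at the file */
        return 0;                              /* no usable input file: fatal     */

    sdp_verify_devices();
    sdp_group_devices();

    QVI_CS_MAPCM();                            /* map the common memory data pool */
    return QVI_CS_POOLPOINTER(SWIC_C_POOLNO) != 0;
}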

Then the application begins building set lists and read lists for each group of 032 module device names it has assembled from the input file. This process is repeated eleven times each beam spill: once for each of the 10 scans that a SWIC scanner might have defined in its internal memory, and once to read back calculated data. For a particular group of 032 modules, four EPICURE request lists will be created. The first is a SET list with the CONTROL property, used to clear the read and write FIFOs of the 032 module. The second is a SET list with the SETTING property, used to write data to the write FIFO of the 032 module; this list will be used to tell the SWIC scanner to begin depositing data into the read FIFO. The third is a READ list with the READING property, which will return byte counts for the read and write FIFOs. The fourth is a READ list with the SETTING property, which will return the contents of the read FIFO. Table 3 illustrates the four different kinds of EPICURE request lists that will be used by the application. Note that the word STAGGERED in the FTD column means that each group's read SETTING and read READING request lists have FTDs a few milliseconds after those of the previous group's request lists.

Each request list (of which there will be 4*n, where n is the number of 032 groups) will be declared with a list ID that uniquely identifies the 032 group and the data block (of which there will be 11) of that particular read or write operation. That ID will be the argument passed to an AST routine which will set a global flag indicating that a particular read or write operation has finished. That way, the sets and reads of SWIC data can be carried out asynchronously. The FTD of each read request list is after the FTD of the previous group's list, so that the reads of 032 data are staggered during the beam spill. The reason FTDs are set for read requests, rather than using FTD IMMEDIATE, is that each read request with FTD IMMEDIATE requires a database access of between 400 and 500 bytes. For upwards of 1,320 reads of block data from the 032's each beam spill, that could become cumbersome for DAR. Using staggered FTDs, on the other hand, will allow DAR to maintain the proper database information in memory from cycle to cycle. If the database OA view is moved to the front-ends, then that consideration will no longer factor into the datapooling strategy. Once the FTDs and the request lists are built and the read requests are sent to the DAE, the program deletes the request lists and repeats the entire process for the next block of data to be pooled from the SWIC scanners. After all 11 data blocks have been pooled for all n groups, the application goes to sleep and wakes up at some time after T. The time at which the program wakes up is set in the data input file.
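One possible encoding of the list ID and its completion AST is sketched below. The figures of 11 blocks and 4*n lists per spill are from the text; the bit packing, names, and array bounds are illustrative assumptions.

/* Sketch only: pack the (group, block) pair into the request-list ID and
 * have the AST set a completion flag.  The packing and all names here are
 * illustrative, not the actual implementation. */
#define SDP_MAX_GROUPS  32
#define SDP_MAX_BLOCKS  11                 /* 10 scans + 1 calculated-data block */

#define SDP_LIST_ID(group, block)  (((group) << 4) | (block))
#define SDP_LIST_GROUP(id)         ((id) >> 4)
#define SDP_LIST_BLOCK(id)         ((id) & 0xF)

static int sdp_done[SDP_MAX_GROUPS][SDP_MAX_BLOCKS];

/* AST completion routine: DAR passes back the list ID; mark that the
 * corresponding read or write operation has finished. */
void sdp_read_complete_ast(int list_id)
{
    sdp_done[SDP_LIST_GROUP(list_id)][SDP_LIST_BLOCK(list_id)] = 1;
}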

Getting SWIC scanner data

Once the program wakes up after T, it executes the following sequence of operations and then goes back to sleep again until the next beam spill.

Errors

Errors will be sent to a log file on the front-end since the program is non-interactive. The application will not terminate execution unless a fatal error occurs. Errors will also be sent to OPCOM by the VMS system service SYS$SNDOPR.
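A sketch of the error-reporting path is given below. SYS$SNDOPR is the VMS service named above, but it is hidden here behind a hypothetical opcom_send() wrapper because the OPCOM message buffer must be built per the $OPCDEF layout, which is omitted; the log-file name is also illustrative.

/* Sketch only: append to a log file on the front-end and forward the same
 * text to OPCOM.  The opcom_send() wrapper and the log-file name are
 * illustrative assumptions. */
#include <stdio.h>
#include <time.h>

extern int opcom_send();    /* hypothetical wrapper around SYS$SNDOPR */

void sdp_report_error(const char *device, const char *text)
{
    FILE  *log;
    time_t now = time(0);

    log = fopen("SDP_ERROR.LOG", "a");       /* illustrative log-file name */
    if (log != 0) {
        fprintf(log, "%.24s  %s: %s\n", ctime(&now), device, text);
        fclose(log);
    }
    opcom_send(text);                        /* also alert the operator console */
}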

Possible Future Enhancements

It is possible that an application may be developed that will reside on user nodes and will read SWIC scanner data from VME common memory, perform data analysis (e.g., histograms, multiple-peak fits, etc.), and display SWIC profiles. Such an application would be developed as time and resources permit.

Appendix A

The first structure is SWICscan, which is the format of the data a user will get back from DAS if data from a single scan is requested. The second structure is ALLscans, the format of the data returned when a user requests data for all ten scans. The size of struct ALLscans is 2092 bytes, which divides evenly into longwords for stuffing into common memory with QVI_CS_VAX2CM(). The #define macros and longword type definitions are for reference. Total common memory usage is expected to be 120 SWICs * 2092 bytes/SWIC = 251,040 bytes (not counting array descriptors, etc.), but this figure will be spread over three or more CAMAC front-ends. Note that the byte sizes of the various types are VAX-specific. The structures will be placed in EPICURE_INC, and the common memory structures may be placed in EPICURE_SYSINC. Two include modules are listed in this appendix: the user include module SWIC_USER and the common memory include module SWIC_CM.

SWIC_USER.H

#ifndef  __SWIC_USER_H
#define  __SWIC_USER_H

/*+
 **********************************************************************
 **********************************************************************
 ****                                                              ****
 ****               EPICURE Beamline Control System                ****
 ****               Copyright (c) Fermilab, 1988                   ****
 ****                                                              ****
 ****               SWIC Datapooling Services                      ****
 ****               Public include module SWIC_USER.H              ****
 ****                                                              ****
 **********************************************************************
 **********************************************************************
 *
 * MODULE:      SWIC_USER
 *
 * VERSION:     1-006
 *
 * FACILITY:    SWIC Datapooling Services (SWIC)
 *
 * ABSTRACT:    Public include module defining data types and constants
 *              needed to access SWIC data in VME Common Memory from
 *              VAX C.
 *
 * ENVIRONMENT: VAX/VMS V5.1.  Uses nonstandard VAX C data types.
 *
 * AUTHOR:      S. A. Ramsey          CREATION DATE: 15-Jul-89
 *
 * MODIFICATION HISTORY:
 *==============================================================================
 *  1-001   15-Jun-89   SAR   Created original.
 *  1-002   19-Jun-89   SAR   Added status codes.
 *  1-003   21-Jun-89   SAR   Broke up data into 11 DSP's + composite DSP
 *  1-004   30-Jul-89   SAR   Added structure definitions for non-VAX C compilers
 *  1-005   24-Jul-89   SAR   Added comments to preprocessor directives
 *  1-006   27-Jul-89   SAR   Added formal EPICURE header comment block
 *
 *==============================================================================
 -*/

/******************
 * Include Files: *
 ******************/

#ifndef FTD_S_EVENT
#include "EPICURE_INC:FTD.H"
#endif  /* not defined FTD_S_EVENT */

/******************
 * #define macros *
 ******************/

#define SWIC_S_NAME         6   /* SWIC titles have a maximum length of 6 characters        */
#define SWIC_C_TYPE_CALC   13   /* `Calculated data' type code returned by SWIC scanner     */
#define SWIC_C_TYPE_1SCAN  14   /* Type code returned by scanner for data from one scan     */
#define SWIC_C_TYPE_5SCANS 15   /* Type code returned by scanner for data from 5 scans      */
#define SWIC_S_SCANS       10   /* Maximum number of scans that the scanner can have defined */
#define SWIC_S_WIRES       48   /* Wires per plane (the scanner integrates 96 channels total) */
#define SWIC_S_DIMS         2   /* Two dimensions of input--vertical and horizontal         */
#define SWIC_M_VERTICAL     0   /* Array index of vertical data                             */
#define SWIC_M_HORIZONTAL   1   /* Array index of horizontal data                           */

/*********************************
 * Global definitions and types: *
 *********************************/

/********************************
 * Local definitions and types: *
 ********************************/

#ifdef VAXC
struct SWICscan {
    unsigned short byte_count;
    unsigned char  data_type;
    char           swic_status;
    char           swic_name[SWIC_S_NAME];
    unsigned char  vert_gain;
    unsigned char  vert_number;
    unsigned char  horz_gain;
    unsigned char  horz_number;
    variant_union {
        struct {
            unsigned short mean;
            unsigned short sigma;
            unsigned short area;
            unsigned short peak;
        } SWICcalc[SWIC_S_SCANS][SWIC_S_DIMS];
        unsigned char SWICwires[SWIC_S_DIMS][SWIC_S_WIRES];
    } x;
};
#else
struct SWICscan {
    unsigned short byte_count;
    unsigned char  data_type;
    char           swic_status;
    char           swic_name[SWIC_S_NAME];
    unsigned char  vert_gain;
    unsigned char  vert_number;
    unsigned char  horz_gain;
    unsigned char  horz_number;
    union {
        struct {
            unsigned short mean;
            unsigned short sigma;
            unsigned short area;
            unsigned short peak;
        } SWICcalc[SWIC_S_SCANS][SWIC_S_DIMS];
        unsigned char SWICwires[SWIC_S_DIMS][SWIC_S_WIRES];
    } scan_data;
};
#endif  /* defined VAXC */

struct ALLscans {
    unsigned scans_defined : 11;
    unsigned num_defined   : 5;
    struct {
        unsigned short   scan_data_length;
        unsigned         : 16;
        unsigned long    camac_read_status;
        struct TimeStamp timestamp;
        struct SWICscan  data;
    } FATscan[SWIC_S_SCANS+1];
};

/*******************
 * Global Storage: *
 *******************/

/*************************
 * Module Local Storage: *
 *************************/

/**********************************************
 * Global Procedure and Function definitions: *
 **********************************************/

#endif  /* not defined __SWIC_USER_H */
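The size figures quoted in this appendix can be checked with a small throwaway program such as the one below (illustrative only, and meaningful only when compiled with VAX C, since the structure sizes are VAX-specific).

/* Throwaway consistency check of the sizes quoted in Appendix A.
 * Meaningful only under VAX C, where the structures pack as described. */
#include <stdio.h>
#include "epicure_inc:swic_user.h"

#define SWIC_POOL_DEVICES 120      /* scanners allocated in the datapool */

int main()
{
    printf("sizeof(struct SWICscan) = %d\n", (int) sizeof(struct SWICscan));
    printf("sizeof(struct ALLscans) = %d (expected 2092)\n",
           (int) sizeof(struct ALLscans));
    printf("pool data size = %d bytes (expected 251040)\n",
           SWIC_POOL_DEVICES * (int) sizeof(struct ALLscans));
    return 0;
}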

SWIC_CM.H

#ifndef __SWIC_CM_H
#define __SWIC_CM_H

/*+
 **********************************************************************
 **********************************************************************
 ****                                                              ****
 ****               EPICURE Beamline Control System                ****
 ****               Copyright (c) Fermilab, 1988                   ****
 ****                                                              ****
 ****               SWIC Datapooling Services                      ****
 ****               System Include module SWIC_CM.H                ****
 ****                                                              ****
 **********************************************************************
 **********************************************************************
 *
 * MODULE:      SWIC_CM
 *
 * VERSION:     1-002
 *
 * FACILITY:    SWIC Datapooling Services (SWIC)
 *
 * ABSTRACT:    Contains structure and constant definitions for
 *              the DAE 032 handler to access SWIC data in VME Common
 *              Memory.
 *
 * ENVIRONMENT: VAX/VMS V5.1
 *
 * AUTHOR:      S. A. Ramsey          CREATION DATE: 27-Jul-89
 *
 * MODIFICATION HISTORY:
 *==============================================================================
 *  1-001   27-Jul-89   SAR   Created original
 *  1-002   25-Aug-89   TMW   Added structures for DAE 386 compiler
 *
 *==============================================================================
 -*/

/******************
 * Include Files: *
 ******************/

#ifndef FTD_S_EVENT
#include "EPICURE_INC:FTD.H"
#endif  /* not defined FTD_S_EVENT */

#ifndef __SWIC_USER_H
#include "epicure_inc:swic_user.h"
#endif  /* not defined __SWIC_USER_H */

/******************
 * #define macros *
 ******************/

#define SWIC_C_POOLNO 0 /* DataPool number in VME common memory */

/*********************************
 * Global definitions and types: *
 *********************************/

/********************************
 * Local definitions and types: *
 ********************************/

#ifdef VAXC
struct SWIC_DSP {
    unsigned short   length;
    unsigned         : 16;
    unsigned long    sts;
    struct TimeStamp tstamp;
    struct ALLscans  data;
};
#else
#define SWIC_KLUDGE 527

struct SWIC_DSP {
    unsigned long x[SWIC_KLUDGE];
};
#endif

struct SWIC_CM_Block {
    unsigned long   access_count;
    unsigned long   write_count;
    unsigned long   error_count;
    struct SWIC_DSP dsp;
};

/*******************
 * Global Storage: *
 *******************/

/*************************
 * Module Local Storage: *
 *************************/

/**********************************************
 * Global Procedure and Function definitions: *
 **********************************************/

#endif  /* not defined __SWIC_CM_H */
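As a consistency check on SWIC_KLUDGE (assuming struct TimeStamp from FTD.H occupies a quadword, 8 bytes, on the VAX; that size is an assumption, not a figure from this note):

$$\mathrm{sizeof(struct\ SWIC\_DSP)} = 2 + 2 + 4 + 8 + 2092 = 2108\ \mathrm{bytes} = 527\ \mathrm{longwords},$$

which matches the 527-longword placeholder used in the non-VAX C branch. If the TimeStamp size differs, SWIC_KLUDGE must be adjusted to match.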

Keywords: EPICURE, RDCS, controls, SWIC scanner, data pooling, C032 module.

Distribution: normal.
