
The Working Directory

The Databridge Client's global working directory (also referred to as the service's working directory) is required for the Client to run. The working directory is distinct from the install directory, into which the installer copies the program files.

The first time you install the Databridge Client on UNIX, you create a working directory. In subsequent installations, you can reuse this directory. For instructions on creating this directory, see Install the Databridge Client on UNIX.

The first time you install the Databridge Client on Windows, the installer creates the working directory for you in the location you specify. This path is saved as the WORKINGDIR string value in the Client's key in the Windows registry.

Contents of the Client Working Directory

The working directory contains four subdirectories: config, locks, logs, and scripts.


config

The program-generated file (log.cfg) that keeps track of the current log file and the binary configuration file (dbcontrol.cfg), which the service updates when you make changes to settings in the Client Console.

If you export the service configuration file, the exported copy (dbcontrol.ini) is created in this subdirectory. When you add a data source, a backup copy (dbcontrol.bak) of the service configuration file (dbcontrol.cfg) is created.

locks

Lock files for each data source. A lock file is created the first time a client run starts; it is reopened on subsequent runs and kept open for the duration of the run to prevent additional runs from using the same data source. The -u option can no longer be used to unlock the data source in command-line runs.

The locks subdirectory is created the first time you start the service (new installations) or when you run the Migrate program (upgrades).

logs

Log files (*.log) created by the service. A [Log_File] section in the configuration file lets you set the criteria for switching log files. For more information, see Appendix C in the Databridge Client Administrator's Guide.

scripts

Command files that the service launches in certain circumstances, such as when a client run terminates or when the BCNOTIFY program on the MCP sends the service a request to run a script file.

This is a suitable location for saving Batch Console source files that contain the commands for the Batch Console.

Each Data Source Has a Working Directory

Each data source requires its own working directory within the Client working directory. This directory is created automatically when you add a new data source in the Client Console or when you upgrade existing data sources using the Migrate program. If you use the command-line client (dbutility), you must create the data source directories yourself. When dbutility executes a configure or define command, it creates the subdirectories (if they don't exist) and a binary configuration file (dbridge.cfg) in the config subdirectory.
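As a sketch of the command-line workflow, the following shell commands prepare a working directory for a hypothetical data source named DEMODB before running dbutility from inside it. The paths and the data source name are illustrative assumptions, not values from this guide:

```shell
# Create a per-data-source working directory inside the global Client
# working directory. "DEMODB" and the paths are placeholders.
WORKDIR="$HOME/databridge"        # global Client working directory (assumed)
DS_DIR="$WORKDIR/DEMODB"          # working directory for one data source

mkdir -p "$DS_DIR"
cd "$DS_DIR" || exit 1

# Running dbutility's configure (or define) command from here creates the
# subdirectories (config, dbscripts, discards, logs, scripts) if they do
# not already exist, plus the binary configuration file config/dbridge.cfg:
# dbutility configure
```

The dbutility invocation is left commented out because it only makes sense on a machine where the Client is installed and configured.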

The data source working directory contains the following subdirectories:


config

The client configuration file (dbridge.cfg). For runs initiated by the Client service, this file must be binary. The command-line client can use a binary or a text file. For details, see Appendix C in the Databridge Client Administrator's Guide. Make sure that you delete any exported text configuration files (dbridge.ini) that are no longer used. These become obsolete when the configuration is updated from the Client Console.

The Null record file for the data source and two program-generated files (log.cfg and trace.cfg), which keep track of the current log and trace files.

Note: You can make the Client add signon information to the configuration files it creates by running the configure or define command with switches that specify your signon parameters. For more information, see Signon Configuration.

dbscripts

Files created by the generate command: SQL scripts, bulk loader files (*.ctl, *.fmt), and command files that invoke the bulk loader. For Oracle, the control files also specify parameters for SQL*Loader.

discards

Files created by the client or the bulk loader when they encounter a data record that cannot be processed. If the Client determines that a DMSII record is bad before handing it to the bulk loader, it puts this record in the client discard file as if it were an insert operation. The data is encoded as an SQL stored procedure call, using the appropriate stored procedure for the type of update.

Discard files can be executed using the database's query tool:

  • SQL Server Management Studio query window
  • Oracle SQL*Plus utility
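As a hedged sketch of how such a replay might be invoked from the command line (the server name, database name, credentials, and discard file name are placeholders, not values from this guide):

```shell
# Replaying a discard file against the relational database for a
# hypothetical data source named DEMODB. Adjust the path to match the
# discard file the Client actually produced.
DISCARD_FILE="$HOME/databridge/DEMODB/discards/customer.sql"   # illustrative

# SQL Server: execute the discarded stored procedure calls with sqlcmd
# sqlcmd -S myserver -d mydatabase -i "$DISCARD_FILE"

# Oracle: execute them with the SQL*Plus utility
# sqlplus user/password@mydb @"$DISCARD_FILE"
```

Review the discard file first; it exists because the record could not be processed, so blindly re-executing it may fail for the same reason.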

logs

Client log files. The filename includes a user-configurable prefix followed by a date (yyyymmdd). If multiple log files exist for the same date, a time (_hhmmss) is appended to the end of the filename.

scripts

User script files.

The data source working directory also contains temporary files. Many of these files aren't removed by the Client and should be deleted periodically. You can safely delete them any time the Client is not running (except during the interval between a redefine command and a reorg command). Temporary files include:

  • trace files
  • bcp log files
  • unload files
  • scripts executed by the redefine command
  • temporary files used by the Windows client during data extraction

Use the logmaint utility to remove old log files. This utility lets you specify the number of log files you want to keep at a given time.
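On UNIX, one hedged way to automate part of this cleanup is a scheduled find command. The path, file-name pattern, and 30-day threshold below are illustrative assumptions, not documented conventions; run it only while the Client is not running for that data source:

```shell
# Housekeeping sketch: delete old trace files from one data source's
# working directory. Paths and patterns are placeholders.
DS_DIR="$HOME/databridge/DEMODB"    # per-data-source directory (assumed)

# Remove trace files older than 30 days. The "trace*.log" pattern is an
# assumption -- match it to the trace file names your Client actually writes,
# and take care not to match config/trace.cfg.
if [ -d "$DS_DIR" ]; then
  find "$DS_DIR" -maxdepth 1 -type f -name "trace*.log" -mtime +30 -delete
fi
```

For the log files themselves, prefer the logmaint utility, since it understands the Client's log naming and retention settings.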