PROJECTS

Application Modernization - NTT Data, Vancouver. 2015-2024
Multiple projects and technologies supporting NTT Data modernization engagements:

  • Data preprocessing tools: Working with a colleague, I developed a set of tools that automated the documentation and step-wise cleanup of foreign system code in preparation for loading it into the NTT Data Transformation Manager (TM). The tools were packaged as Ruby gems and included in the production rollout of TM.
  • Unification of mainframe DDL export processing: Data Definition Language (DDL) exports from mainframe systems can be produced by a number of different programs. Their export formats vary, as does their content, depending on which flavor of SQL is being defined. Over the years, a number of scripts had been developed to handle these variations. I extracted the common properties among them and created a single script that reads export-specific details from a definition file.
  • COBOL syntax decomposition: As part of the code processing, COBOL statement sequences were extracted and presented in different contexts. Some contexts focused on the details of logic, others on sequences of statements. I developed scripts that parsed the COBOL into the necessary snippets for further processing and layout.
  • Workflow integration: Engagements inevitably create new scripts and update existing ones. Carrying the newest scripts into subsequent engagements was largely manual, with developers importing them from whatever source was convenient. I worked with the product team to integrate our Git source control workflow into their TM release build so that the most recent scripts were always available in any engagement.
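
The definition-file idea behind the DDL unification can be sketched as follows. This is a minimal Ruby illustration with invented export names and quirks, not the production tooling:

```ruby
# Hypothetical sketch: each export "flavor" contributes only its quirks
# (comment marker, statement delimiter, case handling); the common
# statement-splitting logic is shared by all of them.

EXPORT_DEFINITIONS = {
  "db2_unload" => { comment: "--", delimiter: ";",  upcase: true  },
  "site_dump"  => { comment: "*",  delimiter: "GO", upcase: false }
}

def split_statements(text, export_name)
  defn = EXPORT_DEFINITIONS.fetch(export_name)
  text.lines
      .reject { |line| line.strip.start_with?(defn[:comment]) } # drop comment lines
      .join
      .split(defn[:delimiter])                                  # statement boundaries
      .map { |stmt| defn[:upcase] ? stmt.strip.upcase : stmt.strip }
      .reject(&:empty?)
end

split_statements("-- header\nCREATE TABLE t (id INT);\n", "db2_unload")
# => ["CREATE TABLE T (ID INT)"]
```

Adding support for a new export program then means adding one definition entry rather than another one-off script.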

Volume Data Transform and Production Automation - Copyright Clearance Center, Danvers, Mass. 2013-2014
This contract involved analyzing the existing ETL process in light of significant volume increases that were expected (and that materialized). Markedly different data sources were arriving in legacy formats and in XML. Working with the design team, I developed code libraries for common transforms and established an SVN repository to house the programs, common modules and code snippets used in the production process. Additional data sources needed to be added mid-stream and were easily incorporated. A Jenkins server was set up and, although Jenkins is normally used for continuous integration, I employed it for the production workflow. It served this function very well, providing automatic check-out of the newest code, linked production steps and automated starting of production sequences. The system processed 70 million records in a month.
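
The common-transform pattern can be sketched like this. It is a hypothetical Ruby illustration (the actual field names and transforms were project-specific): each source declares an ordered pipeline of shared transforms instead of carrying its own copy of the logic.

```ruby
# Hypothetical sketch: a shared library of record transforms, composed
# per data source. New sources reuse the library and add only a pipeline.

module Transforms
  def self.trim(record)
    record.transform_values { |v| v.to_s.strip }  # normalize whitespace
  end

  def self.upcase_id(record)
    record.merge("id" => record["id"].to_s.upcase)  # canonical identifier form
  end
end

# A source's transform is just an ordered list of library steps.
def apply_pipeline(record, steps)
  steps.reduce(record) { |rec, step| Transforms.public_send(step, rec) }
end

apply_pipeline({ "id" => " ab12 ", "title" => " x " }, [:trim, :upcase_id])
# => { "id" => "AB12", "title" => "x" }
```

This is what made mid-stream sources easy to incorporate: a new format mostly meant a new pipeline declaration, not new transform code.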

Infrastructure Development - EBSCO Publishing, Ipswich, Mass. 2003-2011
Multiple projects involving a number of technologies focused on improving EBSCO's infrastructure:

  • Prototyping new approaches to the production process using a Java client to transform the records and create the database. Client HTTP requests were sent to a JBoss application server configured to use Cocoon pipelines. The pipelines took XML from an Oracle database, transformed it (after interacting with other network servers) and placed the results on a JMS queue. The client then used its own SAX parser to process batches from the queue. This novel use of Cocoon and JBoss showed promising increases in scalability and throughput.
  • In a variant prototype, we wrapped a legacy C library with JNI to allow the client to write additional XML records to a MySQL database. Development was on Windows and Linux using Eclipse. An SVN repository was set up and Ant scripts were employed to perform automated builds that created release packages.
  • Demonstrated the use of Schematron as a hand-off validation tool for data arriving from vendors, confirming validity against the many schema changes as the project moved forward.
  • Improved and extended Ant build scripts for the automated build process to produce various release packages from applications and control files retrieved from SVN.
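
The batch-oriented SAX processing in the first prototype can be sketched in miniature. This is a Ruby illustration using the standard-library REXML stream parser (the original client was Java, and the element names here are invented): records are streamed, buffered, and handed off in fixed-size batches rather than loading a whole document.

```ruby
require "rexml/parsers/streamparser"
require "rexml/streamlistener"

# Hypothetical sketch: stream-parse <record> elements and deliver them
# to a handler in fixed-size batches, as the queue-draining client did.
class BatchListener
  include REXML::StreamListener
  attr_reader :batches

  def initialize(batch_size, &handler)
    @batch_size, @handler, @buffer, @batches = batch_size, handler, [], 0
  end

  def tag_start(name, attrs)
    @current = +"" if name == "record"   # begin collecting a record's text
  end

  def text(data)
    @current << data if @current
  end

  def tag_end(name)
    return unless name == "record"
    @buffer << @current
    @current = nil
    flush if @buffer.size >= @batch_size
  end

  def flush
    return if @buffer.empty?
    @handler.call(@buffer)               # hand the batch downstream
    @batches += 1
    @buffer = []
  end
end

seen = []
listener = BatchListener.new(2) { |batch| seen << batch }
xml = "<records><record>a</record><record>b</record><record>c</record></records>"
REXML::Parsers::StreamParser.new(xml, listener).parse
listener.flush  # deliver the final partial batch
# seen => [["a", "b"], ["c"]]
```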

Internationalization - EBSCO Burnaby, B.C. 2002
The search software that EBSCO developed and used in its service was based entirely on ASCII; although it recognized HTML entities, searching non-Latin languages required a new server. In this research and development project, the small team I managed developed a prototype search server supporting Unicode and demonstrated improved search accuracy for English and multilingual queries.

System Upgrades - EBSCO Burnaby, B.C. 1994-1999
I managed the upgrade of production and delivery systems to 64-bit I/O addressing so that databases over 4GB could be serviced under UNIX/Linux and Windows. I coordinated the code review and modifications required for Y2K support, and ported production programs and search server code from Windows to SCO, HP and Solaris UNIX, and then to Linux.

Hot Wire - Async Corporation, Toronto. 1981
This R&D project aimed to create an electronic publishing system using Videotex. It captured business news stories from the Canadian Press wire service to generate a product mix that included custom real-time newspaper delivery.

BC Sport Fishing Estimator - DPA Consulting, Vancouver. 1980-1981
This project produced the milestone "1980-81 Georgia Strait Sport Fishing Creel Survey", which provided statistical data for the Pearse Commission. I was project manager and programmer for the data processing aspects of the survey. The project involved obtaining data stratified by weekday/weekend, location and month. Creel counts of catch by species came in from over 50 sites through interviews with sport fishermen. Float plane overflights on randomly chosen days counted all sport fishing boats in Georgia Strait within two hours. Three planes were used, with up to three counters, so that error bounds could be calculated. By linking the overflight counts with the landing interview counts, a very accurate estimate of the 1981 catch was determined. Over 50,000 interviews were processed (up to 3,000 per week). Most of the data manipulation and preliminary statistics were produced on a Cyber-70; the final estimated catch by species was generated by a custom FORTRAN program on a PDP-11.
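
The core of the estimation — expanding interview catch rates by the overflight boat counts — can be illustrated with a simplified ratio estimator. All numbers below are invented, and the real survey applied this within weekday/weekend, location and month strata:

```ruby
# Simplified ratio-estimator sketch (illustrative numbers only).
# Interviews give catch per interviewed boat; overflights give the total
# boats fishing, so total catch ≈ boats_counted * (catch / boats_interviewed).

def estimate_catch(interviews, boats_counted)
  catch_total = interviews.sum { |i| i[:catch] }
  per_boat = catch_total.to_f / interviews.size
  (boats_counted * per_boat).round
end

interviews = [{ catch: 2 }, { catch: 0 }, { catch: 4 }]  # 3 interviewed boats
estimate_catch(interviews, 150)  # 150 boats seen on the overflight => 300
```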

Weights and Balance Project - Sundstrand Corp., Bellingham, Wash. 1979
This contract involved implementing the user interface for the PC600 Advisory Display. It used a state machine model implemented in 8080 assembly language. It was tested using Intel's In-Circuit Emulator and an airplane simulator.
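
The table-driven state machine pattern behind the interface can be sketched at a high level. This is a Ruby illustration with invented states and events; the original was 8080 assembly, where the table would be a jump table:

```ruby
# Table-driven state machine sketch (states and events invented).
# Each (state, event) pair maps to the next state; unknown events
# leave the state unchanged.

TRANSITIONS = {
  [:idle, :key_press]     => :entering,
  [:entering, :key_press] => :entering,
  [:entering, :enter]     => :displaying,
  [:displaying, :clear]   => :idle
}

def step(state, event)
  TRANSITIONS.fetch([state, event], state)
end

state = :idle
[:key_press, :enter, :clear].each { |ev| state = step(state, ev) }
# state => :idle after a full key_press -> enter -> clear cycle
```

Keeping all behavior in one transition table is what makes such interfaces testable step by step, as was done here with the in-circuit emulator.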