1068 Junction Drive, Manteca, CA 95337
Office Phone: (209) 823-4152, Home Phone: (209) 823-2501, Fax: (209) 823-4152
I have over 30 years of experience in IT, nearly all of it as a consultant. I have worked with a wide variety of hardware and software, and with many different types of applications (see details below). I have managed large projects for Fortune 500 companies, and have also developed many small to mid-sized systems. I have been responsible for the entire software development cycle on numerous projects. I have extensive experience as a lead Systems Architect, lead Data Architect (Data Modeling) and Applications Programmer. I've provided support and enhancements to very large systems that were originally written by teams of other programmers, and I excel at working with others' code. I have a very strong background in manufacturing applications (production scheduling, inventory control, requirements planning, product mix optimization, etc.), mathematical modeling, algorithms, statistics, financial modeling systems, and formal software testing.
I work very well with people, and have extensive experience in interfacing with all levels of management. I'm an incredibly quick study and a strong team player. I have excellent verbal and written communication skills. I'm a good listener, and an excellent problem-solver. I'm conscientious and focused.
I'm energetic, positive, creative, tenacious and self-driven.
For the past 16 years I have been working with Client/Server and 3-tier systems employing GUI and Web-based front-ends and RDBMS back-ends. I am extremely strong in MS SQL Server and Sybase Transact SQL, with extensive experience in system and data design and in writing and optimizing Stored Procedures, Triggers, DDL (DBA functions) and DML.
I can work with either heavyweight methodologies (e.g. UML) or Agile methodologies. I know how to combine the best of a variety of approaches, choosing the right tool(s) for each task or sub-task. My personal preference is an Agile/Evolutionary approach, with short cycles and lots of User interaction and feedback. I'm a huge proponent of thorough documentation, from the requirements definition phase through specifications, inline code documentation, change documentation and User-level documentation.
The remainder of this section presents summary-level information about my experience and education. The following sections present a list of in-house hardware and software tools and detailed work experience information, in reverse chronological order.
References are gladly furnished upon request.
The following is a link to my Public LinkedIn Profile: http://www.linkedin.com/in/AlShermanTNH
If you have a LinkedIn account, then you can access my LinkedIn Recommendations via the above link, by clicking the View Full Profile button on my Public Profile.
Personal interests:
- Nondual philosophies, e.g. the study and practice of Buddhism, lojong, the writings of authors such as Thich Nhat Hanh, Ringu Tulku, Pema Chodron, …
- Strength training and fitness in general. I’m always excited to share this sort of info – ask if you are interested! Kettlebells, bodyweight exercises, dynamic-tension, active flexibility, …
Hardware:
- IBM PCs and compatibles, 24 years
- UNIVAC 90/30 and 90/60-80, 9 years
- HP 3000, both Classic (16 bit, MPE) and PA-RISC, 4 years
Languages and other development tools:
Languages used most recently are listed first. I've been using Object-Oriented (OO) techniques and a variety of OO tools for several years.
- Microsoft Transact-SQL under Microsoft SQL Server 7.0, SQL Server 2000, SQL Server 2005 and (just a bit of) SQL Server 2008. Microsoft Transact-SQL began as, and is still very much like, Sybase Transact-SQL, so my years of Sybase SQL and Microsoft SQL experience really represent continuous use of the same language, commonly referred to as T-SQL. Altogether, I have over 16 years of combined experience using Microsoft SQL Server and Sybase Transact-SQL. I have significant experience with DTS. I have extensive experience using SQL Server Enterprise Manager and some experience using SQL Server Management Studio under both SQL Server 2005 and SQL Server 2008.
- Perl (9+ years). UDP messaging (sockets, etc.), database access via ODBC and DBI, CGI, Oracle Perl Cartridge, …
- WinIDAMS, a statistical package. Worked with this package for a few months while developing Forecasting equations for The Fuel Web.
- Modeling Tools. I have experience with Visio, Sybase PowerDesigner and Enterprise Architect. These tools provide support for UML and other types of Data/System Modeling. I’ve typically used these tools for Data Modeling, Sequence Diagrams and Flowcharts.
- Sybase Transact-SQL (8 years), Sybase 4.x through Sybase System 11
- Clarion (3 years), including Clarion for DOS, and a bit of Clarion for Windows.
- Oracle 8i Enterprise Edition (OAS, OEM, PL/SQL, SQL*Plus) and a variety of related third-party tools (14 months)
- PowerBuilder 3.0 through 6.5 (7 years), including extensive work with PowerClass, an OO Framework
- Pascal (7 years)
- FORTRAN (6 years)
- SAS, BMDP (statistical packages) – quite a bit of experience, albeit quite a few years back
- I have a basic familiarity with a number of other languages and technologies. I'm not current in these skills, but have worked with each to some extent, some for a couple of years: C, C++, Smalltalk, XML, Java and J2EE technologies, Samba, Apache, Dreamweaver, Homesite, Paradox/PAL, SPL, and some of the rudiments of Data Warehousing
I've assembled a great "tool set" to facilitate high-productivity development.
See Hardware and Software Tools below.
Microsoft Windows XP Professional, Microsoft Windows 2003 Server (including Clustered systems), Microsoft Windows 2000 (both Professional and Server), Microsoft Windows NT (3.51 through 4.0), Microsoft Windows 95, Microsoft Windows 3.x, MS-DOS, UNIX (Red Hat Linux 6.2 and Solaris 8), OS/2, MPE, MPE/XL, VAX DCL
MS LAN Manager, MS Windows for Workgroups and WinNT networks, LANtastic, DecNet, Novell, Samba
- M.A. in Operations Research, Yale University, 1973.
- B.A. in Mathematics, Cum Laude, CSU Stanislaus, 1971. Minor in Business Administration.
Hardware and Software Tools
I typically go on-site during the initial phases of a project and "as-needed" thereafter. I perform the vast majority of my work via telecommuting - communicating with my Clients via phone, fax, Instant Messages, Remote Control tools such as GoToMeeting and LogMeIn, and email. I connect to my Client sites via a variety of methods, e.g. PcAnywhere, dialup access, a Virtual Private Network (over the Internet), Virtual Network Computing (VNC), NetMeeting, Remote Desktop Connection, etc. My hardware and software tools are tailored to provide a premier software development and telecommuting environment. I've been providing remote Client support since 1988.
Please let me know if you'd like to see more information on my hardware and software tools.
A partial list of my current hardware and backup environment is as follows:
- Several high-end networked PCs with plenty of processing power, memory, storage capacity, etc.
- 2 types of Backups are performed daily:
- Backups to an external (but local) hard drive
- Off-site Backups of selected files (all truly critical files - source code, documentation, task notes, etc.) to iBackup.com
- In order to provide Disaster Recovery capabilities, I maintain a live offsite warm standby Virtualized environment, using VMware on both the Production and Standby systems. This, when combined with my daily off-site Backups to iBackup.com, ensures that I can be back up-and-running in a minimal amount of time in the event of a total system meltdown, physical facility disaster, etc.
- Top-of-the-line surge-suppression, power-conditioning and Uninterruptible Power Supply (UPS) equipment.
- High-speed Internet connection with firewall (of course) and layers of Anti-Virus and other software tools to prevent system corruption, including Diskeeper on all systems.
A partial list of my software tools follows (note: most of these are installed on my current systems; some have been taken offline, but could easily be resurrected):
- MS SQL Server 2000, 2005 and 2008.
- SQL Server tools provided by Idera Corp.
- SQL Diagnostic Manager. A comprehensive, integrated suite of tools for monitoring and performance-tuning SQL Server.
- SQL Change Manager. A tool to manage and monitor schema changes.
- SQL Admin Toolset. 24 tools for monitoring, troubleshooting, administering and reporting on an organization's multiple SQL Servers.
- SQL Defrag Manager. An application which automates & optimizes database defragmentation.
- Extensive custom-built auto-expanding parameter-driven SQL code templates. These are astounding productivity-boosters.
- Huge libraries of SQL utility procedures and functions.
- Oracle and related third-party tools:
- Oracle 8i Enterprise Edition, which includes Oracle 8i Server and a host of other tools which provide support for server-side Java (EJB, Servlets), Perl, SSIs, CORBA, etc.
- TOAD, a PL/SQL editor
- PL/Formatter, a PL/SQL code formatter
- Oracle Application Server
- Oracle Enterprise Manager 2.0.4
- JDeveloper and other Oracle tools
- Perl and related tools:
- Numerous auto-expanding parameter-driven Perl code templates, including many that I have "custom built". These are astounding productivity-enhancers.
- The ActiveState Perl Development Kit and Debugger
- The Oracle Application Server (OAS) Perl Cartridge and the Perl interpreter which is distributed with OAS.
- Internal web site links to extensive online Perl resources
- Huge libraries of working Perl code, including many generalized utilities.
- Multi-Edit 8.0, a truly amazing programmer's editor. Multi-Edit supports User-definable auto-expanding parameter-driven code templates. I have built a huge library of auto-expanding, parameter-driven code templates for a variety of languages, especially T-SQL and Perl. These templates result in ultra-fast code development and super-clean code, and facilitate the extensive in-line documentation that is one of the hallmarks of my work.
- Visio - used for Data Modeling, Flowcharts and UML diagramming – e.g. Sequence Diagrams.
- Sybase PowerDesigner 9. An enterprise-class modeling package. Supports Data Modeling, forward and reverse database engineering, UML, code-generation and a host of other features. Creates better database diagrams than Visio (which is known to misrepresent data types).
- Enterprise Architect UML tool. An inexpensive, but very nice tool for creating UML diagrams.
- PowerBuilder 6.5
- Ecco Pro, the "World's Greatest PIM". I use Ecco Pro to organize all of my work. Great tool!
- TLIB Version Control System. Supports Check In and Check Out, version extractions and compares, etc.
- Crystal Info and Crystal Reports. I've worked with these tools extensively.
- Microsoft Office products: Word, Excel, etc. I work with these on a regular basis.
- WinBatch and WinMacro. These provide a full macro programming language for the Windows platform.
- Many other tools and a great reference library, including both hard-copy and CD-based resources, e.g. several of the O’Reilly "CD Bookshelf" series.
2/2005 thru present:
Database Architect (shared with others), Database Administrator (shared with others), and Database/Perl Developer with Programming Labs.
The following discussion of the tasks performed to date for Programming Labs provides only a high-level overview. Please let me know if you would like further details. The sections below which discuss my previous contracts provide a much more detailed discussion of the types of optimization processes and complex algorithms which I’ve implemented. Please read these later sections if you want to get a feel for the types of capabilities I bring to the table in these regards.
This system uses a SQL Server 2000 database, in a Clustered environment.
This contract has involved the following types of tasks:
- Stored Procedure and Trigger design, coding and performance optimization. Extensive use of system tables and Dynamic SQL to implement complex generalized Data Driven processes (see the sketch following this list), as well as the development of many other complex stored procs. This has been the primary focus of my work for Programming Labs.
- Database modeling and design, in conjunction with other senior team members.
- Query optimizations to speed ad-hoc queries generated from the ASP code.
- Perl coding – creation of complex routines for import-data preprocessing.
- Creation and maintenance of Database Jobs.
- Creation of numerous DTS Packages.
- Creation of Reporting Services reports, including many complex formats.
- Design recommendations such as using Perl to code the Imports Pre-processor executable.
- Recommendation, installation, configuration and maintenance of various tools such as disk defragmentation tools, database backup and other support applications, etc.
- System troubleshooting, such as resolving a Fatal Exception Error in conjunction with Microsoft support. This error was being caused by address-violation issues related to a third-party product which had been implemented "before my time".
- Resolution of other types of performance-related problems such as Blocking and Automatic Recompile problems.
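To illustrate the Data Driven pattern flagged in the first item above, here is a minimal sketch of the general technique only – not code from the actual system. The 'Import%' naming convention and the LoadDate column are invented for illustration:

    DECLARE @tbl sysname, @sql nvarchar(4000)

    -- Walk the catalog (SQL Server 2000 era: sysobjects) for every user
    -- table matching a naming convention, then build and execute the
    -- same maintenance statement against each one.
    DECLARE c CURSOR FAST_FORWARD FOR
        SELECT name FROM sysobjects
        WHERE  type = 'U' AND name LIKE 'Import%'
    OPEN c
    FETCH NEXT FROM c INTO @tbl
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'DELETE FROM ' + QUOTENAME(@tbl)
                 + N' WHERE LoadDate < DATEADD(day, -30, GETDATE())'
        EXEC sp_executesql @sql    -- purge staging rows older than 30 days
        FETCH NEXT FROM c INTO @tbl
    END
    CLOSE c
    DEALLOCATE c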
On this contract, I have worked primarily with Mr. Alec Sherman, President of Programming Labs and lead project developer, and Mr. Paul Jackson, President of Eutactics, Inc., the primary Programming Labs Client for this contract, and the driving force behind the high-level design and strategic decisions. References will be gladly furnished on request, and/or you may see Paul's recommendation via my LinkedIn Profile at http://www.linkedin.com/in/AlShermanTNH by clicking the View Full Profile link.
My initial task for Programming Labs was to resolve website performance problems, and I’ve continued to be involved in the process of monitoring and resolving performance issues as they are encountered. I use a variety of tools to this end, including:
- Idera’s SQL Diagnostic Manager, SQL Admin Toolset and SQL Defrag Manager
- Microsoft Profiler
- Custom-built stored procs which provide Blocking info and store the results in Snapshots for later review (a minimal sketch follows this list)
- Custom-built stored procs which provide Procedure Trace info, enabling the detection of Uncompleted Processes, and facilitating post-processing diagnostic analysis and performance summary info.
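As a rough illustration of the Blocking-snapshot idea mentioned above (a minimal sketch only – the table and procedure names are invented, not those of the actual procs):

    CREATE TABLE BlockingSnapshot (
        SnapshotAt datetime   NOT NULL DEFAULT (GETDATE()),
        spid       smallint   NOT NULL,    -- the session being blocked
        blocked    smallint   NOT NULL,    -- the session doing the blocking
        waittime   int        NOT NULL,
        cmd        nchar(16)  NULL,
        hostname   nchar(128) NULL
    )
    GO
    CREATE PROCEDURE usp_SnapshotBlocking
    AS
        -- Capture every session currently waiting on another, so that
        -- blocking chains can be reviewed after the fact.
        INSERT BlockingSnapshot (spid, blocked, waittime, cmd, hostname)
        SELECT spid, blocked, waittime, cmd, hostname
        FROM   master..sysprocesses
        WHERE  blocked <> 0
    GO

Run on a schedule (e.g. from a monitoring Job), this builds exactly the kind of after-the-fact Snapshot history described above.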
8/30/01 thru 2/2005:
Chief Database Architect, Database Administrator, Systems Analyst, Software Architect and Developer with The Fuel Web.
We had a "lean-and-mean" team on this contract, and I performed all of the above functions on a regular basis. I have great breadth and depth. Ask my references!
The Fuel Web provides a Web-based service that combines state-of-the-art hardware monitoring of propane tank fuel levels and related information (ambient temperatures, etc.) with leading-edge Fuel Usage Forecasting, Delivery Scheduling, Alert Notifications and many other features.
The working environment is a 3-tier application, with a MS SQL Server 2000 database, and Web-based clients. Using proprietary hardware and software technology, Propane Tank data is gathered from user tanks, forwarded to the database (ingested via Perl), analyzed, used for forecasts, presented on the Website, etc.
When I joined The Fuel Web team in August of 2001, the product was in an early-beta stage. I reported to and worked extensively with The Fuel Web president, Mr. Tom Walker (reference furnished upon request). I also recruited the other database programmer and the Website developer who were subsequently brought onto the team.
Broadly stated, my responsibilities were to interact with Tom to produce a system that would realize his vision. This included extensive interaction regarding Requirements Definitions, and a team approach to the design and implementation of the data architecture and algorithms necessary to realize Tom's objectives.
My major responsibilities were:
- Chief Data Architect and DBA.
Data Modeling. Design, creation and maintenance of all tables, keys, indexes, and preparation of all related documentation.
- System Analysis and Design.
Worked closely with Tom to determine/formalize the system needs, and the high-level approaches to be taken. Documented the Functional Requirements Specifications to be used in the Software Design and Implementation process. Occasionally used UML Sequence Diagrams and Use Cases.
- Software Design, Implementation and documentation of the following types of code, to meet the Functional Requirements Specifications:
- SQL Stored Procedures and Triggers
- Perl routines
- Statistical analysis and derivation of various Forecasting functions.
- Managed/interfaced with a Senior Database Developer and a Web Programmer as needed. Delegation of database DDL tasks when possible - this was a Mentoring type of activity - the Senior Database Developer was able to assume the role of an intermediate-level Data Architect during the course of the project.
The following is a high-level summary of some of the tasks that I performed for The Fuel Web:
- Re-architect the database to Normalize the data structures. The initial architecture reflected the structure of the data as collected in the field units, and contained "repeating groups" in some of the critical tables, as well as other opportunities for improvements. (A before/after sketch follows this task list.)
- Design the Data and Systems Architecture (in conjunction with Tom, who was the primary Systems Analyst). This was an ongoing process during the course of the engagement as new system features and capabilities were added. This was done via extensive interaction with Tom. Typically, we would discuss the system needs, and then I would write up functional specifications to insure that the requirements were well-defined. Once the requirements were nailed down, I would write up the proposed data and systems architecture, and get approval from Tom before implementation. Tom and I worked closely together to evaluate alternative design approaches. Over time, we got to the point where I would simply "proceed to implementation" for many of the simpler modifications for which there were no "business-rule" types of issues that needed Tom’s input. Extensive Data Modeling and Systems Modeling.
- Perform extensive trouble-shooting, error-trapping and code revisions for the existing Bulk Import procedure. Implement Transaction Processing within the Bulk Import procedure, with extensive error reporting and Procedure Trace logging, since there were many ways in which data errors could cause the existing import routines to crash. Determine problem causes and implement solutions.
- Implement modifications to populate the Normalized structures during the Bulk Import process.
- Propose, design and implement infrastructure utilities. E.g. a subsystem for logging Procedure Trace information, allowing for the collection and presentation of detailed procedure/step timing information, error logging for exception conditions, etc. (a sketch of this pattern follows this task list). This formed the basis for the subsequently implemented Notification subsystem discussed below.
- Work closely with all team members to create a Notification subsystem that provides automated Notification to appropriate personnel upon the occurrence of targeted events. The Notification subsystem is extremely flexible, allowing multiple Notification modes (e.g. Email, Pager, etc.), selective delivery by message type, flexible ways of defining delivery groups, periodic Notification resends, and many other features. The Notification subsystem was one of Tom’s many original concepts, and we worked closely together to hammer out the design specifics. After laying out the data architecture and design specifics, I delegated the implementation to other team members.
- Create numerous stored procedures to provide various reports. Some of these were presented on the Website, and some were used internally for diagnostic purposes. Many bells and whistles – e.g. User-specified report "aggregation intervals" for data plots, allowing the User to see the data as originally captured from the field, or aggregated to any desired time-interval, such as Daily, Weekly, etc. (a bucketing sketch follows this task list). Design and create procedures to provide "seamless" integration of Historical and Forecast data for presentation purposes. Many report "selector options" provided – many levels of aggregation across several different dimensions.
- Design and implement numerous Triggers. Since much of the Exception Notification and setting of various system flags and computed columns needed to occur in the Trigger code that was fired during the data import process, the Trigger code was an extremely critical portion of the overall system, and it was often necessarily quite complex. Performance was a paramount concern, since we needed the system to provide scalability.
- Create stored procedures to be used when field hardware was reassigned/moved. Perform Referential Integrity checks and "relink" all the appropriate columns.
- Design and implement Perl scripts to import weather data from a commercial data provider. The Weather Data was imported to the database via ODBC. Import both historical data (for statistical analysis and plots) and forecast data (to be used in Fuel Usage projections).
- Perform statistical analysis of the data, derive Forecasting functions, and implement these functions in the SQL code. WinIDAMS was the statistical analysis software that was used in these analyses. This task involved positing different predictive models, performing comparative analyses using Stepwise Regression, evaluation of results, elimination of outliers, and the final determination of the preferred models.
- Design and create code to determine Fill Requirements for all tanks based on various forecasts, existing tank fuel levels, etc.
- Replace the Bulk Import process with a "real-time" approach that makes ODBC stored procedure calls on a packet-by-packet basis from the Perl script that receives the data from the field units. Starting from the existing Perl code, I implemented performance improvements and added extensive error processing (e.g. to trap for incoming data from different units, perform Update transmission retries, etc.). Logged all errors and timing information to the database ProcedureTrace facility. This script uses UDP messaging to communicate with the field units, and ODBC to communicate with the database. Provided many diagnostic outputs, summary statistics, etc. I’m big on providing tons of inline comments and I always provide each routine with a "debug" parameter that produces massive amounts of inline diagnostic outputs when enabled. Basically, I test each "code branch" as I write it, with diagnostic outputs to confirm the correct operation of the code "just added".
- Modify the Perl UDP messaging script to provide the ability to update the field hardware from information in the database. Make appropriate database revisions, etc.
- Perform performance tuning. Used the execution timing information provided by the Procedure Trace mechanism to identify and implement the selective addition of indexes and a few carefully considered "performance denormalizations".
- Write up Requirements Specifications and proposed solutions for all tasks undertaken.
- Create numerous database diagrams, Sequence diagrams, Use Cases and flowcharts using Visio during the course of the engagement. These were used both for the purpose of documenting existing code/processes, and also for the purpose of explicating design suggestions, functional requirements, etc.
- Create extensive inline and formal written documentation of the various procedures created. UML Sequence Diagrams, Use Cases, Flowcharts – supplemented with tons of narrative commentaries, discussions of fine-points, etc.
- Manage a senior database programmer and interface with the web developer.
- Additional self-contained project to provide a Commission Report for one of the principal investors. This project involved significant DTS work – migration of data from multiple legacy applications to a new "combined" database.
- Many, many other tasks – too many to discuss here – ask if you’d like further information…
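To give a feel for the re-architecting task at the top of this list, here is a before/after sketch of removing a repeating group. All names are invented for illustration; the real tables were considerably wider:

    -- Before: one row per transmission, one column per sample slot.
    CREATE TABLE TankReadingRaw (
        TankID      int          NOT NULL,
        ReadingDate datetime     NOT NULL,
        Level1      decimal(9,2) NULL,    -- repeating group:
        Level2      decimal(9,2) NULL,    -- one column per sample
        Level3      decimal(9,2) NULL,
        CONSTRAINT PK_TankReadingRaw PRIMARY KEY (TankID, ReadingDate)
    )

    -- After: one row per sample, keyed by slot number. Queries,
    -- aggregation and indexing all become straightforward.
    CREATE TABLE TankReading (
        TankID      int          NOT NULL,
        ReadingDate datetime     NOT NULL,
        SampleNo    tinyint      NOT NULL,
        FuelLevel   decimal(9,2) NOT NULL,
        CONSTRAINT PK_TankReading PRIMARY KEY (TankID, ReadingDate, SampleNo)
    )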
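The Procedure Trace infrastructure mentioned above followed the general pattern sketched below. This is a minimal sketch under invented names; the production version carried considerably more detail:

    CREATE TABLE ProcedureTrace (
        TraceID  int IDENTITY(1,1) NOT NULL PRIMARY KEY,
        ProcName sysname      NOT NULL,
        StepName varchar(100) NOT NULL,
        LoggedAt datetime     NOT NULL DEFAULT (GETDATE()),
        ErrMsg   varchar(255) NULL
    )
    GO
    CREATE PROCEDURE usp_LogTrace
        @ProcName sysname,
        @StepName varchar(100),
        @ErrMsg   varchar(255) = NULL
    AS
        INSERT ProcedureTrace (ProcName, StepName, ErrMsg)
        VALUES (@ProcName, @StepName, @ErrMsg)
    GO
    -- Each major step of a procedure calls usp_LogTrace on entry and
    -- exit: per-step elapsed times fall out of consecutive LoggedAt
    -- values, and a missing "exit" row flags an Uncompleted Process.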
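Finally, the User-specified "aggregation interval" reports can be built on the classic integer-bucketing idiom shown below (again just a sketch, reusing the invented TankReading table from the first sketch above):

    CREATE PROCEDURE usp_TankLevelPlot
        @TankID          int,
        @IntervalMinutes int = 1440    -- default: Daily buckets
    AS
        -- Collapse each reading into its bucket via integer division on
        -- the minutes elapsed since a fixed anchor date.
        SELECT  DATEADD(minute,
                    (DATEDIFF(minute, '20000101', ReadingDate)
                        / @IntervalMinutes) * @IntervalMinutes,
                    '20000101')    AS BucketStart,
                AVG(FuelLevel)     AS AvgLevel,
                MIN(FuelLevel)     AS MinLevel,
                MAX(FuelLevel)     AS MaxLevel
        FROM    TankReading
        WHERE   TankID = @TankID
        GROUP BY DATEDIFF(minute, '20000101', ReadingDate) / @IntervalMinutes
        ORDER BY 1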
10/23/00 to 8/3/01:
Senior Database Architect, analyst and developer with Front Porch.
Front Porch was an Internet startup headquartered in Standard, CA, with offices in San Diego and other international locations. The company is now defunct.
The working environment was a 3-tier application, with a MS SQL Server 7.0 database, and Web-based clients. Using proprietary hardware and software technology, Subscriber data was gathered from ISP partners worldwide.
In this position I was responsible for requirements definition, analysis, systems and data architecture, implementation, documentation and training. I reported to several different managers while at Front Porch and supported multiple departments and Users within the organization. I was also involved in some of the DBA tasks (there were several programmers performing DBA work).
My primary accomplishments at Front Porch were:
- Designed and implemented the core system functionality as a set of Stored Procedures that drive Front Porch’s back-end system. I defined the overall design to be used and the required data architecture, and also performed the detailed implementation:
- Subscriber Profiling Algorithms (Category Interest Preferences derived from site visits)
- Subscriber Ad Viewing projections
- Subscriber Eligibility calculations by Category, based on Subscriber Profiles and parameter-driven "eligibility" business rules (a minimal sketch follows this list).
- Order Allocation to qualified Subscribers. Allocations are made "by week" over the duration of each Campaign. AdType, multiple Category specifications, multiple Location specifications and Campaign priority are all taken into account during the Order Allocation process. Inventory is maintained at the Subscriber level.
- Introduced UML diagramming techniques to the Front Porch organization and conducted a critical design session using UML Sequence Diagrams to define and clarify the collaboration of several system components over time. Visio and Enterprise Architect were used to produce the diagrams that were distributed to the concerned parties.
- Performed extensive Stored Procedure performance optimization. Extensive execution plan analysis, runtime "execution statistics" analysis, addition of appropriate indexes to provide optimal performance, etc.
- Performed a variety of DBA tasks, including:
- Wrote DTS scripts to automate data transfer between databases
- Monitored system usage and performed performance tuning as required
- Wrote DDL to create tables, indexes, etc.
- Defined User Roles, GRANTed privileges to tables and Stored Procedures as appropriate
- Installed and administered a Crystal Info server.
- Created numerous Crystal Reports involving drill-downs and numerous advanced capabilities.
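To give a feel for the Eligibility step flagged in the list above, here is a minimal set-based sketch. All table, column and rule names are invented; the production rules were parameter-driven and far richer:

    -- A Subscriber becomes eligible for a Campaign week when his profiled
    -- interest in the Campaign's Category clears that Campaign's threshold.
    INSERT CampaignEligibility (CampaignID, WeekNo, SubscriberID)
    SELECT  c.CampaignID, w.WeekNo, p.SubscriberID
    FROM    Campaign c
    JOIN    CampaignWeek w      ON w.CampaignID = c.CampaignID
    JOIN    SubscriberProfile p ON p.CategoryID = c.CategoryID
    WHERE   p.InterestScore >= c.MinInterestScore   -- business-rule parameter
      AND   NOT EXISTS (SELECT 1
                        FROM   CampaignEligibility e
                        WHERE  e.CampaignID   = c.CampaignID
                          AND  e.WeekNo       = w.WeekNo
                          AND  e.SubscriberID = p.SubscriberID)

The subsequent Order Allocation step then draws on this Eligibility set, week by week, in Campaign-priority order.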
9/15/94 to 8/31/00:
Consultant to Tri Valley Growers (TVG), in Modesto and San Ramon, CA.
During this contract I worked on a number of moderately large systems which were well along in the development process when I joined the project team. I was the primary IT support resource for the Inventory and Scheduling department. I performed the data architecture and system implementation required to solve many complex scheduling and inventory problems.
My primary IT management interface was Mr. Steve Fleury, the Distribution Systems Manager.
My primary User management interface was Mr. Jim Fisher, the Manager of Inventory and Scheduling.
The working environment was a Client/Server application, with a SYBASE 4.x Server and a PowerBuilder client, connected via DecNet. Through the course of the project the database server evolved to SYBASE System 11. Oracle 8i was also used extensively over the last several months of the contract. Perl was used to create a variety of scripts.
My primary accomplishments while at Tri Valley Growers were:
- Performed extensive Stored Procedure Optimization.
In response to performance problems in existing Stored Procedures and PowerBuilder Selects, I became deeply involved in the inner workings of the SYBASE Optimizer. In many instances the SYBASE Optimizer generates counterintuitive Execution Plans, which can have disastrous performance effects. Working closely with the resident TVG SYBASE guru, I learned many of the undocumented "tricks" which may be required to force the Optimizer to perform as desired. Additionally, I thoroughly analyzed the text of several of the leading books which deal with SYBASE Optimization. Through the judicious restructuring of Stored Procedures, and the addition of a few well-chosen "covering" indexes (a sketch of the technique appears at the end of this list), I was able to replace Table Scans of large tables, which were generating Logical Read counts in the 100K range, with indexed accesses requiring between 10 and 4K Logical Reads. The performance improvements were outstanding.
I also defined a general approach to writing "cost-efficient" Stored Procedures.
These Performance Optimization notes were documented using Doc-2-Help and presented with examples to our programming group.
The Execution Plan/Statistics analyses were performed from Windows using RapidSQL, a product offered by Embarcadero Software. These results were also frequently analyzed from output to a standard ISQL output file.
Database Administration functions were executed (when required, e.g. to add test indexes) using DBArtisan from Embarcadero Software.
- Designed and implemented a new approach to the Stock Allocation algorithm.
The Stock Allocation algorithm was a real-time Production Scheduling and Inventory Allocation algorithm that was implemented as a set of Stored Procedures.
This algorithm was called to allocate stock and make preliminary labeling and shipping assignments for Sales Orders as they were entered.
It was not feasible to put the existing algorithm into production because, given the time it took for the existing algorithm to execute, there was not enough time to process all the Sales Orders entered. A 4-fold performance improvement was required.
This was considered to be a mission-critical problem.
I was invited to attend a meeting of 10 principal members of the MIS group in San Francisco. This group consisted of high-level professionals who were much more familiar with the operational problem than I was, and included the designers of the original code.
The solution which I proposed was adopted in preference to all other proposals made.
I wrote up the formal specifications, and completely coded and tested the solution in 3.5 weeks.
A 10-fold performance improvement resulted. The original approach had taken many man-months to develop. The basic reason for the improvement was the transition from an "explicit enumeration" approach to a "heuristic" approach. The heuristic approach generated solutions which were identical to the "explicit enumeration" approach in almost all instances, and where the two differed, there was no clear advantage to either one - i.e. the resultant solutions were essentially "cost equivalent".
- Assisted in the conversion of data in a variety of formats to "SAP-formatted" flat files.
This task involved the conversion of data from various flat-files, Excel spreadsheets, and Sybase databases.
The approach used was to "pipe" all of the data into a common Oracle 8i database and then to generate the required flat files and reports from the Oracle 8i database. ODBC and PowerBuilder "pipelines" were used to get the data from the various data sources into the Oracle 8i database.
- Fixed existing system bugs and implemented system enhancements.
Used internal bug-tracking system to log fixes to existing bugs as assigned by the project manager. Performed many system enhancements, especially involving the Stock Allocation and Scheduling Stored Procedures. Interfaced with Users to define enhancement requirements and specifications. Wrote formal requirements/specifications documentation and reviewed/refined with IT management and User groups.
- Designed and implemented system reports.
The reports were coded in PowerBuilder, Crystal Info, and Infomaker.
- Performed extensive system design and wrote SQL scripts, Stored Procedures, Triggers and User Interfaces for a wide variety of purposes. This was my primary function at TVG.
- Wrote Perl scripts to automate SQL script generation and processing.
- Created general purpose Stored Procedures (various SQL "utilities").
- Created general purpose PowerBuilder Functions.
- Modified SYBASE 4.x code to compile under SYBASE System 10.
- Redesigned existing screens to provide an improved GUI design.
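As promised in the Stored Procedure Optimization discussion above, here is a minimal sketch of the covering-index technique. The names are invented; the real tables and queries were far larger:

    -- The index carries every column this query touches, so the Optimizer
    -- can satisfy the query entirely from the index leaf pages instead of
    -- scanning the underlying table.
    CREATE INDEX IX_OrderLine_Cover
        ON OrderLine (ProductID, WarehouseID, QtyOrdered)

    SELECT  WarehouseID, SUM(QtyOrdered)
    FROM    OrderLine
    WHERE   ProductID = 1234
    GROUP BY WarehouseID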
Various other In-House work not mentioned elsewhere:
Installed and configured all of the hardware and software packages mentioned in the Hardware and Software Tools section above. Administered all Operating Systems, networks and database servers (Oracle 8i, Sybase). Administered Oracle Application Server and Oracle Enterprise Manager. Set up a Samba server and client to provide connectivity between my NT and Linux systems. Configured Apache to provide Perl support on my Linux system.
Developed in-house code templates using Dreamweaver, HTML, Oracle Application Server, Perl, and Oracle 8i.
Pre-1994 Work Experience
I have a "Word97" formatted document which provides a much greater level of detail on my "pre-1994" work experience. If you'd like to see more details than what's presented below, simply call me at 209 823-4152 and I'll gladly either fax or email you an "expanded" version.
5/92 to 4/93 and 9/93 to 9/94:
Consultant to General Mills, Inc., in Lodi, CA. Implemented one system using PowerBuilder 3.0 and PowerClass, and two systems using Clarion and a variety of Clarion 3rd party products. MS SQL Server 4.2 (Sybase 4.2) was the database. Managed one other senior programmer. Converted the PowerBuilder system to ObjectView. I was responsible for full system development, from initial user meetings through documentation, training, and support. My primary interface was Mr. Tom Hauan, the Lodi Plant Computer Services Manager.
Extensive work with Stored Procedures, Cursors, Triggers, DDL, Windows API calls. Installed Sybase SQL Server on in-house systems.
4/93 to 9/93:
Consultant to Sierra Pacific Power, in Reno, NV. Performed bug fixes and implemented new functionality for an existing Clarion-based financial planning system. I've lost touch with my reference on this contract.
8/88 to 5/92:
As owner of Eucalyptus Software, I managed a six-person programming shop during this time. I was heavily involved in the Turbo Pascal, Paradox and Clarion projects undertaken during this period.
Consultant to Applied Materials, Inc., in Santa Clara, CA.
- Developed a sophisticated multi-user Forecasting and Inventory Control system, written in Turbo Pascal. Many 3rd party tools were used on this system - Turbo and Object Professional, B-tree Filer, Blaise Power Tools, Turbo Plus and more. Tlib was used for version control. Many advanced features such as:
- record-by-record data compression using a variant of the LZW compression algorithm (66% disk space savings, with only 10% speed degradation)
- context-sensitive Hypertext on-line Help
- many other features – let me know if you'd like details.
- Created a Paradox/PAL financial reporting application.
10/87 to 8/88:
Software test consultant to Hewlett-Packard (CSY/ADTL) in Cupertino, CA. Development of regression tests for PANDORA.
- Use of internal test automator to establish a regressable test suite, enabling quick test result analysis (exception reporting) to ensure that new bugs are not introduced through ongoing code changes.
- Use of internal Path Flow Analyzer for Path Flow Coverage reporting.
- Close working relationship with project engineers for test development and bug resolution.
- Statistical analysis of potential performance improvements. Results impacted product development decisions.
This project made use of Pascal, SEGMENTER, the AUTOMAN testing environment, HP's test automator and Path Flow Analyzer, and many other HP3000 tools such as QEDIT, DEBUG, MPEX, STARS, and XEDIT, and was implemented under the MPE V operating system.
3/86 to 10/87:
Software test consultant to Hewlett-Packard (CSY/SWQE) in Cupertino, CA. Development of regression tests for operating system intrinsics and system kernel of MPE XL.
- Use of internal test automator, as above.
- Other details omitted due to lack of space.
These projects made use of SPL, Pascal, Pascal/XL, and a broad range of HP3000 tools, including Process Handling, Dynamic Loading, Traps, Interprocess Communications, Resource Management, Privileged Mode, and mixed mode programming (Compatibility Mode <---> Native Mode). Extensive use of system intrinsics was made.
All tests were implemented on the HP3000 PA-RISC machine.
9/85 to 3/86:
Consultant to Gallo Wineries in Modesto, CA. Performed statistical analysis, as follows:
- Used SAS and SAS/GRAPH to analyze and present data from the NIELSEN scantrack service.
- Drew conclusions and presented written reports to upper management.
- Multi-colored side-by-side bar charts were generated on the IBM 3287 plotter for use by Marketing.
This work was performed on the IBM 4341, under CMS. In addition to using SAS and SAS/GRAPH, a minor amount of interface with DYNAPLAN and FOCUS was performed.
1/74 to 9/85:
Operations Research Consultant and Senior Programmer/Analyst for Foster Poultry Farms in Livingston, CA.
This environment supported the most technically advanced work of any environment in which I've ever worked. The organization had highly sophisticated personnel who had deep understandings of Operations Research concepts, statistical procedures, etc. Additionally, the underlying nature of the business was a rich source of optimization opportunities.
My primary contact was Mr. Loy Gould, and this, like almost all of my other engagements, was a great working relationship.
Summary of responsibilities:
- Responsible for all phases of systems development - conceptualization, design, coding, implementation, user training, documentation, and maintenance.
- Management of in-house personnel.
- Statistical analysis of data and integration of the resulting functions into computer models.
While at Foster Farms, I designed and developed the following systems. Except for a few "one-time" systems (e.g. the Financial Planning program used in bank negotiations) these systems were used on a regular basis to provide critical operational and strategic planning information. Several of these systems were ported through major hardware platform migrations.
- A Feeding Strategy system to optimize nutrient levels given current ingredient and end-product prices, incorporating statistically derived weight and feed consumption response functions.
- A Product Mix Optimization system. This system used Linear Programming to provide optimal production quantities and ingredient usages, subject to current pricing, resource availabilities and demand constraints. Extensive "post-optimal sensitivity analysis" was provided. This system incorporated numerous innovative LP techniques. (The generic formulation is sketched after this list.)
- A Financial Planning program which solved simultaneous equations to project several years' activity. Pivotal in bank negotiations.
- A Plant Personnel Scheduling system to satisfy staffing requirements over 3 shifts, subject to shift length and days off requirements. Minimization of total straight time, overtime and weekend payroll.
- A Feedmill Scheduling system implementing complex delivery trigger points, parameterized feed consumption and mortality curves, multiple feed types and ranches.
- A system to apply the California Dept. of Weights and Measures tare compliance program to internally sampled data, with corresponding compliance percentages, etc. This system enabled the identification of seriously over-tared products.
- An Analysis of Variance system to provide reports for experimental data.
- Several LOTUS applications, involving heavy use of macros.
- Many other applications, typically heavy on the Math Modeling, simulation techniques, etc.
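For readers who want the flavor of the optimization work above, the Product Mix system solved the standard Linear Programming formulation, stated here generically (the actual model's variables and constraints are, of course, specific to the business and long since retired):

    \max_{x \ge 0} \; c^{T} x \quad \text{subject to} \quad A x \le b

where x is the vector of production quantities, c holds the current net unit margins, and the rows of Ax <= b encode the resource availabilities and demand constraints. The "post-optimal sensitivity analysis" mentioned above reports how far each price or constraint can move before the optimal solution changes.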
The mainframe projects described above were performed on the UNIVAC 90/30 and 90/60-80. The micro projects were performed on an IBM PC.
Languages used during this period were: Pascal, FORTRAN, C, LOTUS, BMDP, and SAS.