
Installation Hand Off to the Client

As an integrator, there is a great deal we can do to make an installation a success, but very little we can do to guarantee its continuing success. One of the most common issues is the handoff of an installation to the client, in particular the ongoing maintenance of the database.


It’s All About the Database

The key component of any content management, archive, or workflow system is the database. This key component is often overlooked when a new imaging system is put in place: the database is placed on an existing server that already has many critical duties, or a new instance is installed on the same server as the imaging system software. These choices might be fine for a proof of concept, but as a system takes on more data and more users, the database becomes the performance bottleneck.

As with any database, issues rarely arise within the first year of use. The data is slow to grow, and queries are quick and responsive because the load on the system is minimal. During this period the number of active users in the system at any one time is low as well, and there are more inserts taking place than retrievals. But the point of putting in one of these systems is not for the amount of data and the number of users to remain small; most organizations bought their software and underwent the installation and configuration process with the goal of creating a vast repository that makes access to their content quick and easy for a large community within the organization.

It is when the system has finally been fully adopted and holds enough content to be useful that the undersized database server starts to slow the return of data to users. I have seen systems where the input of new data is scheduled during off hours to compensate for the database server's inability to handle inserts as well as retrievals during business hours. By this time the money for the original project is long since gone, and oftentimes the stakeholders have forgotten that the less-than-desirable database server configuration was chosen as a short-term solution during the software's "pilot phase."
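
A simple way to stay ahead of this is to track repository growth over time rather than waiting for users to complain. The sketch below is a hypothetical example, assuming Python with the python-oracledb driver against the imaging database; the connection details and the monitoring account are my own placeholders, not part of any IPM install. It simply logs the size of the largest database segments so the growth trend is visible long before it becomes a bottleneck.

    # Hypothetical growth-monitoring sketch: log the size of the largest
    # database segments so an undersized server shows up as a trend,
    # not a surprise. All connection details are placeholders.
    import datetime
    import oracledb

    def snapshot_segment_sizes(dsn, user, password):
        with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
            with conn.cursor() as cur:
                # user_segments is a standard Oracle dictionary view.
                cur.execute("""
                    SELECT * FROM (
                        SELECT segment_name, ROUND(bytes / 1024 / 1024, 1) AS mb
                        FROM user_segments
                        ORDER BY bytes DESC
                    ) WHERE ROWNUM <= 10
                """)
                stamp = datetime.datetime.now().isoformat(timespec="seconds")
                for name, mb in cur:
                    print(f"{stamp}  {name:<30} {mb:>10} MB")

    if __name__ == "__main__":
        snapshot_segment_sizes("dbhost/ORCL", "ipm_monitor", "change_me")

Run on a schedule, a log like this makes the case for a dedicated database server with numbers instead of anecdotes.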

Planning Applications and Repositories within Oracle IPM

One of the philosophies I have developed over the years is to separate applications and their image repositories: for every application developed, I build a storage volume and a storage class incorporating that volume. These do not need to be physically distinct locations; they can be virtual names for the same location, such as a Centera device, or two different folders on a SAN drive. The key is being able to reference them individually within the IPM product, as in the sketch below.
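
To make the convention concrete, here is an illustration only; this is not an IPM API, and the application names and paths are made up. Two applications get their own volume and storage class names even though both could resolve to the same SAN device.

    # Illustrative naming convention: one storage volume and one storage
    # class per application, even when they resolve to the same device.
    # All names and paths here are hypothetical.
    STORAGE_PLAN = {
        "AP_Invoices": {
            "volume": "VOL_AP_INVOICES",
            "storage_class": "SC_AP_INVOICES",
            "physical_path": r"\\san01\ipm\ap_invoices",
        },
        "HR_Personnel": {
            "volume": "VOL_HR_PERSONNEL",
            "storage_class": "SC_HR_PERSONNEL",
            # Same SAN is fine; the logical names stay separate, so this
            # application can be purged or migrated on its own later.
            "physical_path": r"\\san01\ipm\hr_personnel",
        },
    }

    for app, plan in STORAGE_PLAN.items():
        print(f"{app}: class {plan['storage_class']} on volume {plan['volume']}")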

The advantage I have found is that if a time comes when the images must be separated, or different purge cycles are desired, these actions can be accomplished from the standard Oracle IPM interface. The individual application could even be spun off into its own installation with some work by an IPM professional.

In my opinion, the short-term cost in time of building these extra storage class and volume references is well worth it for not having limited your future options with the system.

Jeff Doyle

Senior Systems Engineer

ImageSource, Inc.

Scanning Software Configuration Best Practices

With scanning software it is tempting to get carried away and create many different batch class configurations, each optimized for a specific type of document. The most common case is a batch class created for single-page documents alongside another that uses exactly the same indexes and feeds the same application, but handles multi-page documents. Oftentimes the extra time spent separating documents into multiple piles for input to the different batch classes offsets the time saved by the batch class optimization. In organizations that take this approach, you often find the single-page batch class has been abandoned once the workers tasked with document preparation, scanning, and indexing reach the same conclusion. This is not an absolute, but in general, unless there is a large volume of single-page documents, the extra batch class is not worth the effort.

The follow-up is that when you do have one of these abandoned batch classes, it should be deleted. That means less confusion for new employees being trained on scanning operations, and no wasted time upgrading and testing unused batch classes during system upgrades.
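
A quick back-of-the-envelope check makes the tradeoff visible. The numbers below are purely illustrative assumptions, not measurements from any site; the point is the comparison, not the figures.

    # Hypothetical cost/benefit check for a separate single-page batch class.
    # All numbers are illustrative assumptions, not measurements.
    docs_per_day = 1000
    single_page_share = 0.15              # fraction of docs that are single-page
    sort_seconds_per_doc = 2.0            # extra prep time to sort every doc
    seconds_saved_per_single_page = 1.5   # what the optimized class saves per doc

    single_page_docs = docs_per_day * single_page_share
    sort_cost = docs_per_day * sort_seconds_per_doc        # every doc gets sorted
    optimization_gain = single_page_docs * seconds_saved_per_single_page

    print(f"Daily sorting cost: {sort_cost / 60:.1f} minutes")
    print(f"Daily time saved:   {optimization_gain / 60:.1f} minutes")
    print("Worth keeping" if optimization_gain > sort_cost
          else "Not worth the extra class")

With these assumed numbers, sorting costs about 33 minutes a day to save under 4, which is exactly the conclusion the scanning staff tend to reach on their own.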

Jeff Doyle
Senior Systems Engineer
ImageSource, Inc.

Oracle IPM Workflow Software: Overuse of Custom Forms

With a product such as Oracle IPM software, the temptation arises to develop a new front end for an existing software solution the client may have in place. The IPM product has a GUI that allows easy rule-based routing, e-mail notification, trigger events, and more, with no programming skill required. The product also allows for considerable custom development, which is where the temptation to over-develop can occur.

The workflow is powerful in aiding the processing of incoming content, whether submitted web forms or scanned documents, and it provides the tools to facilitate data entry. Its ability to create custom forms allows data entry to occur solely within the IPM platform. Care should be exercised, though: it should never be the goal to simply put a new face on an existing data entry method. That usually leads to excessive coding of the workflow form in an attempt to mimic all the functions and rules of the existing product, creating a bond between the two products that will require more effort in the future when the platforms undergo upgrades.

In many instances the workflow is at its best when used as a simple delivery mechanism for the content; this gives the organization the advantages of an electronic workflow while minimizing, if not eliminating, the need for custom coding. In other instances there is a happy medium, usually when a separate mechanism such as Kofax KTM has been used to extract the data, or when the input mechanism is a web form where the data has already been captured. In these cases a workflow form and a script event that subsequently uploads the data after review can be extremely efficient; again, the caution is to not try to recreate an already existing program.
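
The pattern is simple enough to sketch. The example below is not IPM's scripting environment, just an illustration in Python of the review-then-upload step: the fields arrive already extracted (by KTM or a web form), a person confirms them on the workflow form, and a small script hands them to the existing line-of-business system rather than re-implementing its rules. Every name and endpoint here is hypothetical.

    # Hypothetical "script event" sketch: take fields already extracted
    # upstream, and after the reviewer approves, hand them off to the
    # existing line-of-business system. The endpoint and field names
    # are made up for illustration.
    import json
    import urllib.request

    def upload_after_review(fields, reviewer_approved):
        if not reviewer_approved:
            return  # leave the item in the workflow queue for rework
        # Deliver the data as-is; the receiving system keeps all of its
        # own rules and validation, so nothing is duplicated here.
        req = urllib.request.Request(
            "http://lob-system.example.com/api/invoices",  # hypothetical endpoint
            data=json.dumps(fields).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print("Upload status:", resp.status)

    # Example: fields captured upstream, confirmed by the reviewer.
    upload_after_review(
        {"invoice_number": "INV-1042", "amount": "1318.40", "vendor": "Acme"},
        reviewer_approved=True,
    )

Note how little logic lives in the workflow side: review and delivery only, with validation left where it already exists.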

The line between enough custom code and too much can be tough to ascertain, but with experience and some common-sense evaluation of how much the end user's processing experience actually gains, a balance should be obtainable, cutting development costs and time to implement for your solutions.

Jeff Doyle
Sr. Systems Engineer
ImageSource, Inc.