Category Archives for "Document Management"

Oracle IPM 10g and Imaging 11g Migration

One of the things we’ve done a lot of here at ImageSource is migrations; it’s definitely one of our core competencies. For more on our approach to migrations, you can review my earlier post here. Lately, migrations have become more focused because the majority of them involve moving content out of the Oracle IPM 10g or Oracle Imaging 11g products. Oracle IPM 10g has reached end-of-life, and Oracle Imaging 11g was the terminal release of the product, so that product line is essentially dead. We worked with IPM 10g for many years, so we have a wealth of knowledge of its ins and outs. IPM was a feature-rich but aging product stack in need of a rewrite; however, when Oracle rewrote the product as Imaging 11g, a number of key features didn’t make the cut.

Because of everything I’ve mentioned, businesses running on these Oracle ECM platforms have had to make decisions about their long-term ECM vision and roadmap. I have worked with a number of clients on technology evaluations to help determine their roadmaps, but that’s a blog post for another time. One key piece of any ECM roadmap for a company making these solution changes is the migration of content out of the systems being replaced. Luckily, we have ILINX Export and ILINX Import to make these migrations as straightforward as possible.

There are a number of options with ILINX Export, but in short, we use it to export all content and metadata out of a source system for migration into any destination system. By default, ILINX Export retrieves content from the source system in exactly the format it was in when it was added to the original system. By exporting content in its native format, a customer can keep a copy of the original data, and any data manipulation or file conversion can be done downstream. ILINX Export does have the ability to convert files to PDF, but we generally recommend performing image conversion when importing content into the destination system. Using our knowledge of Oracle ECM, there are plenty of options when extracting content from these products. For example:

  • Only migrate certain applications.
  • Only migrate content created after, or before, a certain date.
  • Only migrate content that falls within certain criteria, e.g., a specific business unit, a set of document types, or virtually any criteria that can be identified from the content metadata.
  • Split the content up so content that meets certain criteria goes to one destination, and content meeting other criteria goes elsewhere.
  • Retain the IPM or Imaging annotations. These can be flattened into the documents, but I only recommend that in certain instances. If the client is migrating to ILINX Content Store, we can migrate the annotations as an overlay into the new ILINX system.
  • There are many options for formatting the data when it is exported from the source system. ILINX Export can output the metadata to text or XML files with complete control over the format, delimiter, field order, layout, and size of those files. That flexibility allows for the creation of input files in a format that works for just about any destination system.
  • The metadata can also be written directly to a SQL database to support long-term storage or manipulation if necessary.
  • Schedule the export to run during off-hours to keep the load off the servers while clients are using the old system.
  • Detailed auditing of the entire process to help with reporting, compliance and troubleshooting.
  • Many more.
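To make the metadata-output option above concrete, here is a short sketch of the kind of delimited metadata file a migration export can produce alongside the native-format documents. This is purely illustrative: the field names, delimiter, and layout are assumptions, not ILINX Export’s actual output format.

```python
import csv

def write_metadata_file(path, records, delimiter="|"):
    """Write one row per exported document: metadata fields plus the
    relative path to the exported native-format file.

    The field names and "|" delimiter are hypothetical examples of a
    destination-system input format, not an ILINX-defined layout."""
    fieldnames = ["DocumentId", "BusinessUnit", "DocumentType",
                  "CreatedDate", "FilePath"]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames,
                                delimiter=delimiter)
        writer.writeheader()
        for record in records:
            writer.writerow(record)

# Sample record for a single exported document (illustrative values).
records = [{
    "DocumentId": "1001",
    "BusinessUnit": "AP",
    "DocumentType": "Invoice",
    "CreatedDate": "2015-06-01",
    "FilePath": r"export\1001.tif",
}]
write_metadata_file("metadata.txt", records)
```

Because the delimiter, field order, and file layout are all parameters, the same export run can be re-targeted at a different destination system just by changing the output format, which is the flexibility described above.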

Once you’ve defined all the rules surrounding the migration and started execution, the next step is importing that content and metadata to the destination ECM system. For that, we use our ILINX Import product which I’ll cover in a later post. If you have any questions about ILINX Export, reach out to us for a demo or discussion.

John Linehan
Sr. Systems Engineer
ImageSource, Inc.

Transferring ILINX Release Configurations When Upgrading

Starting with ILINX Capture v6, Release configurations are stored within the ILINX database. In ILINX Capture v5x, ILINX Release configurations were stored in XML files on disk, and ILINX Capture called ILINX Release using a SendAndReceiveReply IXM. Storing the settings within the ILINX database is useful for a number of reasons: Release settings are part of the batch profile, allowing for simpler migrations between environments; Release is much easier to configure; all configurations are in the database; etc. However, this change can create some extra work when upgrading from ILINX Capture 5x to ILINX Capture 6x: because of the different architecture, ILINX Release needs to be completely reconfigured for the existing batch profiles. Fortunately, the Release settings XML itself didn’t change, so there is a shortcut that can be taken. After you have upgraded ILINX Capture to v6, you’ll notice a new ILINX Release IXM in the palette.

The existing ILINX workflow will likely have a SendAndReceiveReply IXM on the map that the 5x version of ILINX Capture used to call ILINX Release. To configure ILINX Release for ILINX Capture 6x, the SendAndReceiveReply IXM needs to be removed from the map and a Release IXM dragged onto the workflow map in its place. Once the new Release IXM is on the map, it needs to be configured. This is where the shortcut comes in. Instead of manually entering the correct URLs, mapping the metadata values, and configuring every other setting, do this:
Configure and save Release with some placeholder settings. I normally leave the settings at their defaults and enter only the bare minimum:

  • Job Name
  • User Name
  • Password
  • Batch Profile
  • Release Directory

Once the ILINX Release configuration is saved and the workflow map is published, there will be a new entry in the ILINX Capture database’s CaptureWorkflowAppSettings table. The CaptureWorkflowAppSettings.SettingsXML column is where the Release configuration is stored. Now it’s time to update the SettingsXML column with the XML from the ILINX Release 5x job settings file. The Release job file should be on the ILINX Release 5.x server at C:\ProgramData\ImageSource\ILINX Release\Settings\Jobs. The only caveat is to be sure to place single quotes around the XML content. Here is what the SQL update statement would look like:

update [ILINX CAPTURE DATABASE].[dbo].[CaptureWorkflowAppSettings]
set SettingsXml = 'COPY AND PASTE ALL TEXT FROM 5.4 OR PRIOR RELEASE JOB SETTINGS FILE HERE'
where SettingsID = 'APPROPRIATE ID HERE'
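One practical wrinkle with the update statement above: any single quotes inside the 5.x job settings XML must be doubled, or the T-SQL string literal will break. Here is a small sketch of that escaping step; the helper function and the sample values are illustrative, not part of any ILINX tooling.

```python
def build_update_statement(settings_xml: str, settings_id: str) -> str:
    """Build the T-SQL update for the CaptureWorkflowAppSettings row.

    Doubles any embedded single quotes so the XML survives being
    pasted into a T-SQL string literal. The settings_id value is a
    hypothetical example; use the actual row's SettingsID."""
    escaped = settings_xml.replace("'", "''")  # double embedded quotes
    return (
        "update [dbo].[CaptureWorkflowAppSettings] "
        f"set SettingsXml = '{escaped}' "
        f"where SettingsID = '{settings_id}'"
    )

# Example: XML containing a single-quoted attribute value.
stmt = build_update_statement("<Job Name='AP Invoices'/>", "42")
print(stmt)
```

Escaping mechanically like this avoids hunting down quote errors by hand when a job settings file happens to contain apostrophes.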

Following this procedure can save time when upgrading an ILINX Capture 5x system that has a lot of batch profiles, since much of the upgrade effort can be in the ILINX Release configuration. If I were upgrading a system with only a few batch profiles, I would probably just reconfigure them; with a lot of batch profiles, I would go through the steps above to save time.

John Linehan
Sr. Systems Engineer
ImageSource, Inc.

Failover Cluster Troubleshooting

There’s nothing quite like logging in to a customer’s system first thing Monday morning only to be greeted with this:

Windows PowerShell cluster report

I discovered this when I wasn’t able to log in to the customer’s ILINX Capture implementation. The logged error (failure to locate the SQL Server) led me to look at the SQL Server’s configuration and confirm that its service was not running on either node of the cluster, and the error I got when trying to start it (a clustered resource could not be activated) led me to check the clustered resources themselves.

Implementing SQL FILESTREAM Part II

Last month I wrote about enabling SQL FILESTREAM with ILINX Content Store. After discussing this with a few people, I think I should share some more information and reiterate a couple points.

For Existing Applications:
As I mentioned before, the decision to enable FILESTREAM should be made during the planning phase. Performing this process on an application with a lot of content can be a very time-consuming endeavor with a significant performance impact on the server. Also, after the move from BLOB to FILESTREAM, you could be left with a fragmented database. The BLOB-to-FILESTREAM process can definitely be done on an existing system; just be sure to plan accordingly and allow sufficient time.

After step #10 of my previous blog post (all the data is copied and you have deleted the BLOB column), you will notice that the database file size hasn’t decreased. This is remedied easily enough by executing a DBCC CLEANTABLE command, which reclaims the space from the dropped variable-length column (the third argument is the batch size: the number of rows processed per transaction). For example, if your database is named ILINX_CS and your application is named Sample Application, the query is:

DBCC CLEANTABLE ('ILINX_CS','[dbo].[Sample Application]',10000)

Indexing Tables in Kofax-Based Environments

We recently had a customer who needed to migrate off of an aging, highly customized, one-off capture/indexing/workflow solution. At the center of many of their form types in this system was a repeatable field collection object that functioned much like a .NET DataTable control: values could be added horizontally to the current “row,” and at the end you could hit Enter and a new “row” would be added. As you moved through, you also had the ability to validate each line item as a whole. In other words, nothing too out of the ordinary.

Unfortunately, this stood out as a red flag to both my coworker and me when we first saw it, since we were migrating the client to Kofax Capture. There’s nothing inherently wrong with Kofax’s flagship product; in fact, it is an excellent tool for getting content where it needs to be, often in record time. One thing it doesn’t do well out of the box, however, is table fields. Defining one looks normal enough, but when you actually get the chance to index them, each column ends up being a standard index field. Needless to say, turning the table 90 degrees counter-clockwise and forcing keyers to manually delimit values is not an ideal experience, especially when 99% of your form is tables that need to be indexed.
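To see why the manual-delimiter workaround is painful, here is a sketch of what a downstream process has to do to reassemble table rows from per-column index fields keyed with a delimiter. The “|” delimiter and the field names are assumptions for illustration, not Kofax Capture behavior.

```python
def fields_to_rows(column_fields: dict, delimiter: str = "|"):
    """Turn per-column delimited index fields back into row dicts,
    e.g. {"Qty": "1|2", "Desc": "Widget|Gadget"} becomes two rows.

    Columns with fewer keyed values than the longest column are
    padded with empty strings, since keyers can easily miss a value."""
    columns = {name: value.split(delimiter)
               for name, value in column_fields.items()}
    row_count = max(len(values) for values in columns.values())
    return [
        {name: values[i] if i < len(values) else ""
         for name, values in columns.items()}
        for i in range(row_count)
    ]

rows = fields_to_rows({"Qty": "1|2", "Description": "Widget|Gadget"})
# rows[0] == {"Qty": "1", "Description": "Widget"}
```

Every keying mistake (a missed delimiter, an extra one) silently shifts values into the wrong row, which is exactly the kind of fragility you want your capture tool, not your keyers, to be responsible for.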