Moving your Records to the Cloud, a Checklist

by Frank 15. February 2017 06:00

You or your boss have decided to move your records management processing to the Cloud; that is, to a Cloud-based records management solution.

Typical Scenario

Currently, you run a legacy records management system on old servers somewhere in the computer room. You are aware that the records management software you are running is old, out of date and no longer supported. You also suspect that the server, operating system and database software are similarly old and out of date. You also have no confidence in the backups and don’t think your server is included in any Disaster Recovery Plan.

The boss recently attended a risk management seminar and came back full of enthusiasm and focussed on minimizing processing risks. Your records management system was identified as a big risk because you are responsible for 1.5TB of company data, documents and emails going back 20 years. The boss delegated to you and said, “Get it done!” Where do you start?

You could just call up a selection of records management software vendors and ask them to provide quotations but without prior research and preparation on your part, what you receive back will not be apples to apples. Each vendor will see the problem differently and you will spend a lot of time trying to answer a plethora of often confusing questions. There will be no clear conclusions and it will be difficult to make a selection of vendor or even know what you will end up with.

Take Advantage of the Opportunity

Alternatively, as you have already decided that a new software solution is required, it is a great time to re-evaluate everything you hold and everything you do. This is the time to cull and to modernize and improve all of your business processes. Please don’t, under any circumstances, be convinced by anyone to try to transfer your in-house mess to the Cloud; that would be anathema.

Instead, plan on instructing the vendors on how you want to go forward, not on how you process now. Do your research and culling and modernizing and produce a report before you call in the vendors.

Cull and Simplify

The first job is to research exactly what you have in your database and associated physical files both in-house and at offsite record centres. You are going to need help from someone who is still an expert in your legacy system and you are going to need help from IT when trying to analyse the contents of your database. Nevertheless, get the help you need and then produce a list of all holdings, both physical and electronic. Do your best to find out exactly what is being held by offsite storage companies.

This isn’t thankless work because if you do your job well there is the very real potential of saving your company a lot of money in both floor space and offsite storage costs. Be a hero.

Use your retention schedule and obtain management decisions to cull as much as possible, both electronic and physical. If in doubt, lean towards “throw it out” rather than “let’s hold on to it just in case.” If you haven’t had cause to reference something in 7 plus years, it is extremely unlikely that you ever will so, as you walk around the filing areas, repeat this mantra under your breath, “If in doubt, throw it out!”

Now look at your business processes, how old and manual and inefficient are they? For example, do end users have to fill in forms and submit them to records when trying to find something or can they just login and find it in seconds?

Please avoid the “we do it this way because we have always done it this way” syndrome. Be brave, be innovative, think outside the square; this is your time to shine! Sit down with users and ask them how they would like the new system to work. There are three magic questions you can always use to solicit the answers you need.

1       “What are we doing now that you think we shouldn’t be doing?”

2       “What aren’t we doing now that you think we should be doing?”

3       “What are we doing now that you think we can do better?”

Document your new business processes.

Produce a report

We aren’t talking about a magnum opus; all we need is a short, concise report that lists all the holdings after culling, as well as your ‘new’ required business processes, also suitably culled and modernized.

As we are going to provide this report to vendors to begin the quoting process we also need to include information on your operational and security requirements. You will need help here but it doesn’t really matter if your report isn’t 100% accurate, at least for now. What you are primarily interested in is getting an apples to apples response from your chosen vendors. If it later turns out that you need 60 users not 50 users or 3TB of storage rather than 2TB of storage or an average half second response time as opposed to a 1 second response you can easily get the vendors to adjust their quotes.

In other words, don’t agonize over whether or not your report is perfect (it can never be anyway); just make sure it is logical, makes sense and reflects your needs at a point in time. You are guessing about what future usage and processing needs will be anyway because lots of things will change when the new records system is rolled out.

What to look out for

The following is a guideline, not an exhaustive or complete list. It should be a subset of your requirements.

  • Make sure the vendors understand that your data needs to be stored in the country you nominate.
  • Make sure that the records management software includes the functionality you require. Try not to be too prescriptive; leave room for the vendors to tell you how they would solve your problem with their unique solutions. Be cautious about ‘optional’ features that may or may not be in your implementation.
  • Make sure the contract includes the vendor capturing and importing all your data and documents in agreed formats.
  • Make sure your system is fully redundant. Obviously, the safer it is and the more redundancy you have the higher the cost. It’s a trade-off, argue with your masters for the highest possible level of redundancy.
  • Get commitments of support that meet your needs.
  • Get commitments on planned and unplanned downtime that meet your needs.
  • Get commitments on backups that meet your needs.
  • Get commitments on bandwidth and response time that meet your needs. Remember that there are two connections to worry about; your company’s connection to the Internet and the data centre’s connection to the Internet. Be aware of possible bottlenecks.
  • Get commitments on data centre redundancy. What happens if their internet connection fails or their power fails?
  • Make sure that your data is as secure as possible. Ask them what international and government standards they meet on data security.
  • Make sure that you are able to dynamically grow or shrink your requirements; it is a foolish person who thinks he/she can accurately predict the future.
  • Make sure that there is an out clause in your contract; look carefully at any termination clauses. You want an ongoing assurance of service but you do not want to be locked in and you do not want to have to pay unfair or unreasonable penalties if you terminate.
  • Make sure that there are sensible clauses to handle disputation.
  • Make sure that your data always remains your property. Don’t allow the vendor to exercise any lien on your data in the future. Your data should always be your property and you should always have access to it no matter the circumstances.
  • Make sure that you clearly understand and agree with the billing algorithm; if it appears too complex then it is too complex. Please don’t give your accountant anything that will be a nightmare to reconcile every month. Don’t sign until you know exactly what your monthly subscription cost is going to be.

References

And finally, as always, ask for references. Other people have been down this road and it behoves you to learn from their experiences. Don’t just call them, go and visit them and spend time asking for their opinion. Use your 3 magic questions again.

1       “What did you do (moving to the Cloud) that you now think you should have done differently?”

2       “What did you do that you now think you shouldn’t have done?”

3       “What didn’t you do that you now know you should have done?”

Then it should just be a matter of selecting a vendor, agreeing a project plan and making it happen. If you have done your homework, it will be far easier than expected.

Good luck.

The Essential Digital Records checklist

by Frank 10. February 2017 12:00

So, you have decided, or have been instructed, to digitize all your records. Now what?

Where do you start? When do you start? What do you need to get the job done?

Lists

Just as with all complex projects, you are best to start with a simple list.

List all the records that need to be digitized; by type, by volume, by current format and by location. Review this list with your peers first (double-check that you haven’t missed anything) and then with management. Ask questions of management like:

 “Do you want me to digitize all of these records regardless of how long it takes and how much it costs?”

“When do you want me to start?”

“What is the budget?”

“What extra resources can I call on?”

“When do you want this project to complete?”

 “What are the metrics that will determine if I am successful?”

These are the core questions, the ones you must ask. Your dialog with your manager will probably result in many more questions and answers depending upon the unique circumstances of your organization. However, as long as you ask these core questions and get answers you are well on the road to producing a project plan.

Management Approval and Ownership

Your project will fail unless you have a senior manager ‘owning’ and supporting it. You need a friend in high places covering your back and authorizing your actions if you are going to be successful.

IT Support

Ask your senior manager to select and appoint a senior IT resource to be your IT point man. You are going to need IT support throughout the project and you need to know before you accept the project that someone senior will be appointed as your IT liaison person. Without readily accessible and committed (to the project) IT support you will fail.

The Project Plan

All project plans begin with multiple lists, for example: a list of all the tasks to be completed, a list of all the people who will work on the project, etc. Most importantly, you need to sort the tasks in order of prerequisites – i.e., we have to complete Task A before we can begin Task B. You also need to have sub-lists for each project employee listing their relevant expertise or capabilities; not everyone is equal. Some people can complete a particular task, some can’t. Some will take a day to complete a task; others may take 3 days to complete the same task. You need to be well aware of capabilities before you assign tasks to individuals.
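The prerequisite ordering described above is exactly what a topological sort produces. As a minimal sketch (the task names and dependencies below are hypothetical, purely for illustration), Python's standard `graphlib` module can do the ordering for you:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical tasks for a digitization project; each task maps to the
# set of tasks that must be completed before it can begin.
tasks = {
    "Select scanning software": set(),
    "Install scanners": {"Select scanning software"},
    "Train staff": {"Install scanners"},
    "Begin scanning backlog": {"Install scanners", "Train staff"},
}

# static_order() yields the tasks so that every prerequisite appears
# before the tasks that depend on it (and raises on circular dependencies).
order = list(TopologicalSorter(tasks).static_order())
print(order)
```

The same discipline applies whether the "sorter" is a library call or a whiteboard: no task is scheduled before everything it depends on.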

You need a good tool to document and manage your project plan because project plans are complex and dynamic. Never in the history of the world has there been a static project plan. The last thing you want to change, however, is the agreed (with your boss) completion date. Your main objective should always be to complete on time and your second objective should be to complete on budget. If you don’t have a project management system, try Microsoft Project; it’s low cost, relatively easy to use and can do the job.

Human Resources

If your boss expects you to be responsible for the new records digitization project as well as your normal job you have the beginning of a big problem. If the boss expects you to complete the project without having any assistance your problem is probably terminal. You will need help probably both from within your organization and outside your organization because it is unlikely that you will have all the expertise you need within your organization.

Make sure that your agreement with your boss includes the additional human resources you need to be successful.

Software

It is unlikely that you will already have the software tools you need to be successful. Basically, you need software tools to capture, digitize, store and retrieve your records. Because records come in multiple formats you will need to ensure that you have the necessary software tools for each format of record to be captured. Refer to the initial list you compiled of records to be digitized by type, by volume, by current format and by location. Make sure that you have a software tool for each type of record, for example, scanning and indexing software to capture paper records.

Most importantly, make sure that you have a secure, scalable image and data repository to store and manage all of your digital records. This will usually be a structured database based on systems such as Oracle or SQL Server.

There is little point in digitizing your records if they can’t be centrally accessed, managed, searched and retrieved.

Hardware

Software requires appropriate hardware. Make sure that you have permission and budget to acquire the prerequisite hardware such as servers, workstations, scanners, etc. You will probably need help from your IT department in defining exactly what is required.

Management

Your job is to manage, not facilitate. As project manager, you accept responsibility for both success and failure. Your job is to make things happen. Your job is to continually review progress, to identify and remove roadblocks. Your job is to keep all project staff focussed and on mission. It is a lot of work and a lot of effort and sometimes, a lot of frustration. You have to be prepared to regularly consult with both project staff and users. You have to be prepared to make tough decisions. You have to be committed and focussed on success but not stubborn. Sometimes it is better to give a little to win a lot. Always focus on the end result, completing the digitization project on time and on budget.

Success or Failure

There are absolutely no good technical reasons for failure. The expertise, hardware and software required to digitize all of your records are readily available from a plethora of vendors. Furthermore, there are plenty of examples, both good and bad, in the market for you to learn from. There is no record that can’t be digitized. The only difference between success and failure is you and your initiative, creativity and commitment.

The secret to increased productivity

by Frank 4. August 2015 06:41

By nature I am and have always been a sequencer and an overlapper. It comes naturally to me. It is how I process everyday events. For example, in the morning when making a pot of tea I first fill up the jug and turn it on to boil before emptying and cleaning the teapot. This is because I want the two tasks to overlap for maximum efficiency. If I emptied and cleaned the teapot first before filling up the jug and turning it on, the elapsed time required to make a pot of tea would be longer and therefore inefficient. With my method I save time because the total elapsed time to make a pot of tea is how long it takes to fill the jug and get it to boil. I correctly sequence and overlap the two events to be more productive. It also helps me to get to work on time.

Here is another simple example. Have you ever been in a restaurant and watched with frustration as the waiter brought out meals and then returned to the kitchen without picking up your dirty dishes? Then watched in frustration again as the waiter came out to pick up dirty dishes but left someone’s lunch at the kitchen counter getting cold? Why doesn’t the waiter pick up dirty dishes, or take your order, on the way back to the kitchen? Life, business and government are full of such everyday examples of non-overlapping, poorly sequenced processes, all resulting in lower productivity and higher costs for everyone.

The worst example of all is when employees are allowed to tightly redefine their jobs concentrating more on “this is what I don’t do” instead of “this is what I do”. For these employees, the terms ‘multi-skilling’ and ‘multi-tasking’ are anathema. I envision them standing within a tiny, tight circle where anything outside of that circle is not their responsibility. We may as well brick them up inside a chimney. These are not the kind of employees or practices I want in my business or our public service or our government for that matter. Unfortunately, these are exactly the kind of anti-productivity practices we find throughout our public sector and our government. As most of us are already more than well aware, the problem is more than endemic; it is systemic and probably not fixable short of a revolution. It is no secret why our taxes are so high and getting higher all the time.

Many years ago when I was a trainee programmer I learnt all about overlap while being trained at IBM. The patient instructor made the point that computers only seem to do multiple things at the same time. In fact, the architecture of computer processing at that time meant a computer could only process one command at a time but, by making use of overlap and time-sharing, it appeared as if it was doing many things at once. For example, the IBM 360 processor would issue an I/O command to a channel to go off and read a record from a disk drive. Relatively speaking, this took an enormous amount of time because disks were so slow compared to the CPU. So instead of waiting for the channel to complete the I/O request the processor would process other work while waiting for the channel to interrupt it and say “I am finished, here is the data you asked for”. So the computer appeared to be doing multiple tasks at once because it correctly sequenced the tasks it had to perform and took full advantage of overlap. Therein lies a lesson for all of us.

When faced with a list of tasks to perform first think about the opportunities for overlap. Then sequence the tasks to take maximum advantage of overlap.  

All it requires is the desire to work smarter, a little thought and a sense of pleasure in making best use of the limited time life allows us all.

In my role as a designer of computer software I always try to take advantage of sequencing and overlap. In my business, the two terms most used when implementing this approach are asynchronous events and multi-threading. These two techniques should always be applied when a list of tasks to be performed is not sequential. That is, they don’t have to be completed one after the other in a strict sequence. We take advantage of the fact that some tasks are independent and therefore can be processed at the same time we process other tasks. We do this in various ways but usually by defining them as asynchronous events and by utilizing a form of multi-tasking or multi-threading (starting two or more events at the same time). Computers aren’t smart (at least not yet) and they rely totally on human programmers to make them behave in an efficient and ‘smart’ way. Computer programmers who don’t understand sequencing and overlap can write very bad and very slow programs even to the extent of making very fast computers look very slow. Then, they waste everyone’s time and become major contributors to the anti-productivity movement.
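The overlap described above can be sketched in a few lines of modern asynchronous code. In this minimal Python illustration (the task names and timings are invented, with sleeps standing in for slow disk or network I/O), two independent tasks are started together so their waits overlap instead of adding up:

```python
import asyncio
import time

# An independent "I/O" task; asyncio.sleep stands in for a slow
# disk read or network call handed off to a "channel".
async def fetch(name: str, seconds: float) -> str:
    await asyncio.sleep(seconds)
    return name

async def main() -> float:
    start = time.perf_counter()
    # gather() runs both coroutines concurrently; total elapsed time is
    # roughly the longest single task (0.2s), not the sum (0.4s).
    await asyncio.gather(fetch("read-disk", 0.2), fetch("call-api", 0.2))
    return time.perf_counter() - start

elapsed = asyncio.run(main())
print(f"elapsed: {elapsed:.2f} seconds")
```

Run sequentially, the same two tasks would take twice as long; the program is faster only because the waits were sequenced to overlap.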

There is an enormous amount of money being invested today in the science of longevity; in trying to find ways to make it possible for people to live longer lives. When the solution becomes available it won’t be cheap and it won’t be available to ordinary people like you and me. It will initially only be available to the elite and to the very rich. However, don’t despair; there is a low-cost way to double the amount of time you have to enjoy life. An easy way, available right now, to double your effective life span.

All you have to do is be aware of the possibilities of sequencing and overlap in your life and then work to take advantage of them. If you reduce the amount of time you take to do ‘work’ every day by fifty, forty, thirty or even twenty percent, you are adding years to the time you have to live and enjoy life. It is the easiest and lowest cost way to increase your effective life span.

For example, don’t try to impress your boss by working longer hours; arriving first and leaving last (as my generation did). Instead, impress your boss with a proposal whereby you do more work in fewer hours. You of course need to quantify your proposal and add in some metrics so your increased productivity can be measured and proven.

Please don’t waste your time and your effective life span by pondering ways to avoid work; instead, utilize those same cognitive processes to work out how to complete your assigned work in the fastest way possible. Approach every project looking for ways to better sequence tasks and take advantage of overlap. Make it a game; enjoy it.

I was once told that the average pattern of a human life is eight-hours work, eight-hours sleep and eight-hours play. Of course, with commuting, it is really now more like eight-hours sleep, ten-hours work and six-hours play. Let’s try and double those play hours.

As I am fond of saying, it isn’t rocket science. It is just common sense, a very simple and achievable way to significantly increase your effective life span; the time available to you to enjoy life. Give yourself twice as much time to enjoy life and in doing so, live twice as long. 

Increased productivity doesn’t just provide benefits to the economy; it can also provide very substantial personal benefits. Why don’t you give it a try?

The absolute easiest & lowest cost way to meet all Electronic Document & Records Management (EDRMS) requirements?

by Frank 19. May 2015 06:00

Because we are a software vendor that builds and markets a range of Enterprise Content Management tools under the RecFind 6 banner I have often been asked, “What is the absolute easiest and lowest cost way to meet all compliance requirements?”

I usually respond with a well-considered and ‘traditional’ response that includes information about Business Classification Systems, UI design, Retention Schedules, etc. The solution proposed would also require a significant degree of consulting so that we are entirely conversant with the customer’s requirements and business practices, and also involve a significant amount of end user training.

This is what the customer expects and it falls in line with the traditional, professional approach.

However, the final solution is rarely ‘easy’ or ‘low-cost’ primarily because it has followed the traditional approach. The more we ask questions and consult and the more people we speak to the more complex the solution becomes. This is normal because we end up trying to configure a solution to meet hundreds or thousands of variables.

There is an easier and lower cost way but I fear that very few of my customers would ever consider it because it requires them to disregard everything they have ever learned about rolling out an EDRMS. We have tried proposing it a few times but never with success. It usually gets shot down by the external consultant or the internal records management professional or both.

It doesn’t require a BCS or a Taxonomy and it doesn’t require a complex Retention Schedule and it doesn’t require significant consulting or significant end-user training. Records Management professionals will surely hate it as a ‘career-ending’ trend. It does require an open mind, the ability to think laterally and a willingness to redefine both the problem and the solution.

It only has three requirements:

  1. Know what electronic documents and emails you don’t want to capture;
  2. Provide a powerful but easy-to-use search that allows anyone to find anything instantly; and
  3. Employ a risk-management approach to retention and select a single retention date (e.g., 7 or 20 years).
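As an illustration only, the three requirements above reduce to very little logic. This sketch is entirely hypothetical: the exclusion words and the 7-year period are assumptions for the example, not recommendations (search, the second requirement, is a product capability rather than a rule):

```python
from datetime import date

# Hypothetical exclusion rules and retention period, for illustration only.
EXCLUDED_SUBJECT_WORDS = {"out of office", "newsletter", "lunch"}
RETENTION_YEARS = 7  # one risk-managed retention period for everything

def should_capture(subject: str) -> bool:
    """Requirement 1: know what NOT to capture."""
    s = subject.lower()
    return not any(word in s for word in EXCLUDED_SUBJECT_WORDS)

def destruction_date(captured: date) -> date:
    """Requirement 3: a single retention date replaces a complex schedule."""
    return captured.replace(year=captured.year + RETENTION_YEARS)

print(should_capture("Re: supply contract"))   # True
print(should_capture("Weekly newsletter"))     # False
print(destruction_date(date(2017, 2, 15)))     # 2024-02-15
```

The point of the sketch is the shape of the decision, not the rules themselves: one yes/no gate at capture time and one date calculation, instead of a per-record classification exercise.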

Fundamental to the success of this non-conformist solution is the acceptance that computers and storage are dirt-cheap compared to human time. If your IT manager or CIO still agonizes and complains about how much disk space you use up for emails and electronic documents then this is definitely not the solution for you. Your IT hierarchy is still living in the long-gone past when computers and disk were expensive and people were cheap (by comparison).

However, if you have practical, sensible IT people then the approach is worth considering especially if your organization has a long history of failing to digitize its records and automate its processes. That is, you have tried at least once to roll out an organization-wide EDRMS and have failed and/or blown the budget. The word ‘pilot’ probably appears often in your company history usually prefixed by the adjective ‘failed’. Don’t feel too bad, most pilots are initiated because management lacks conviction. They are therefore destined to fail.

We have the tools required to implement such a solution but I won’t go into detail about them now. This is a concept paper, not a detailed instruction manual. If you are interested in the concept please contact me and I can then elaborate.

So, if you really do want to rollout a successful EDRMS and do it in the fastest and least disruptive and lowest cost way possible then please write to me and pose your questions.

For the doubters, this is the same way we manage our electronic documents and emails at Knowledgeone Corporation and we have done so for many years. We use our own software; apart from a couple of accounting packages we run our whole company with the RecFind 6 Product Suite and totally automate the capture of all electronic documents and emails. All my staff have to know is how to search and yes, they can find anything in seconds even after 31 years of operation and a very, very large database.

It is not difficult, it is not ‘expensive’, it does not require a huge amount of management or maintenance time and it runs largely in the background. As I said above, all your staff have to learn is how to search.

It does however, require an open mind and a desire to finally solve the problem in the most expeditious manner possible. But, please don’t tell me you want to run a pilot. Test my solution by all means and put it through the most vigorous change control procedures but don’t damn the end result by beginning with a “we are not really sure it will work so are not really committed and won’t allocate much of a budget but let’s try a pilot anyway because that limits our exposure and risk” approach.

I don’t want to waste your time or mine.

How to simplify electronic document and email management

by Frank 17. September 2014 06:00

I have written about this topic many times in the past (see links at the end of this post) but the lesson is always the same. There are two key rules:

1.     If your system relies on people being 100% consistent and reliable it won’t work; and

2.     If your system places an additional workload on already busy workers it won’t work.

The message is, if you simplify and automate your system you give it the best possible chance of working.

If your system works as automatically as possible and doesn’t require much effort from your workforce then it has the best possible chance of being successful.

With today’s technology and tools there is simply no need to burden your workforce with capture and classification tasks. Do you still see people using typewriters, rotary phones or Morse code? No you don’t, because there is much better technology available. So why do you persist with an old, outdated and unsuccessful model? Why do you ask your staff to manually capture and classify electronic documents and emails when there are much better, much faster, much more consistent and much more reliable ways to achieve a better result? It is after all 2014, not 1914; we all use computers and smart phones now, not typewriters, wind-up rotary phones and Morse code.

Emails are managed by email servers (yes, even Google). Email servers allow plug-ins and add-ons and are ‘open’, so you can automatically monitor and capture incoming and outgoing emails.

Electronic documents are always saved somewhere, for example on your shared drives or directly into your DMS. As such they can be captured and interrogated programmatically.

It is entirely possible to ‘parse’ any electronic document or email and its associated attributes and Metadata and make consistent decisions about whether or not to capture it and how to classify it when captured. It isn’t rocket science any more, it is just analysis, design and programming. We can go even further and determine who should be notified and what action(s) need to be initiated in response to each new email or electronic document.  
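As a hedged illustration of that parsing, the sketch below uses Python's standard `email` module with an invented message and a hypothetical keyword-to-category rule table; a production system would inspect far richer metadata (senders, attachments, message bodies) than subject lines alone:

```python
from email import message_from_string
from email.message import Message

# Hypothetical rule table mapping subject keywords to classification
# categories; the categories and keywords are illustrative only.
RULES = {"invoice": "Finance/Invoices", "contract": "Legal/Contracts"}

RAW = """\
From: supplier@example.com
To: accounts@example.com
Subject: Invoice 1044 for February
Content-Type: text/plain

Please find invoice 1044 attached.
"""

def classify(msg: Message) -> str:
    """Decide a category from the parsed headers, with a default bucket."""
    subject = (msg["Subject"] or "").lower()
    for keyword, category in RULES.items():
        if keyword in subject:
            return category
    return "General/Unclassified"

msg = message_from_string(RAW)
print(classify(msg))  # Finance/Invoices
```

Because the rules run in code rather than in people's heads, the same email is classified the same way every time, which is precisely the consistency a manual regime cannot deliver.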

We can easily implement an end-to-end business process whereby every electronic document and email is managed from creation to destruction and we can do this with minimal human involvement. Where human involvement is required, for example making a decision or deciding upon an appropriate response, we can also automate and manage the business processes required and simply ‘present’ staff with all the required information when required.

Isn’t this what the Knowledge Management revolution was supposed to be about?

“A system that provides the user with the explicit information required, in exactly the form required at precisely the time the user needs it.”

The new model is all about automation and processing at the server rather than at the user’s workstation; a fully automatic, server-centric paradigm. A system that is all about the ‘Push’ rather than the ‘Pull’ model. A model whereby the computer services the end user, where the end user is not a slave to the computer.

We could also call it management by exception. “Please only give me what I need to see when I need to see it.”

None of the above is new or revolutionary thinking, it is all just common sense. None of the above requires yet-to-be invented technology or products, it only requires existing and proven technology and products.

The fully-automatic, server-centric approach should be the default choice and it should be a no-brainer for any organization that needs to implement an email and document management regime. Unfortunately, too often it isn’t.

If you have the responsibility of rolling out an email and document management system and the fully-automatic, server-centric approach isn’t on your agenda then your boss should be asking you why not.

References:

White papers

Posts

How to clean up your shared drives, Frank’s approach

by Frank 22. August 2014 06:00

In my time in this business (enterprise content management, records management, document management, etc.) I have been asked to help with a ‘shared drive problem’ more times than I can remember. This particular issue is analogous to the paperless office problem. Thirty years ago when I started my company I naively thought that both problems would be long gone by now but they are not.

I still get requests for purely physical records management solutions and I still get requests to assist customers in sorting out their shared drives problems.

The tools and procedures to solve both problems have been around for a long time but for whatever reason (I suspect lack of management focus) the problems still persist and could be described as systemic across most industry segments.

Yes, I know that you can implement an electronic document and records management system (we have one called RecFind 6) and take away the need for shared drives and physical records management systems completely but most organizations don’t and most organizations still struggle with shared drives and physical records. This post addresses the reality.

Unfortunately, the most important ingredient in any solution is ‘ownership’ and that is as hard to find as it ever was. Someone with authority, or someone who is prepared to assume authority, needs to take ownership of the problem in a benevolent dictator way and just steam-roll a solution through the enterprise. It isn’t solvable by committees and it requires a committed, driven person to make it happen. These kinds of people are in short supply so if you don’t have one, bring one in.

In a nutshell there are three basic problems apart from ownership of the problem.

1. How to delete all redundant information;

2. How to structure the ‘new’ shared drives; and

3. How to make the new system work to most people’s satisfaction.

Deleting redundant Information

Rule number one is don’t ever ask staff to delete the information they regard as redundant. It will never happen. Instead, tell staff that you will delete all documents in your shared drives with a created or last-updated date earlier than a nominated cutoff (say, one year in the past) unless they tell you specifically which ‘older’ documents they need to retain. Just saying “all of them” is not an acceptable response. Give staff advance notice of a month and then delete everything that has not been nominated as important enough to retain. Of course, take a backup of everything before you delete, just in case. This is tough love, not stupidity.
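As a sketch only, the cull described above could be scripted along these lines. The function name, paths and retention list are hypothetical; a real cull should log every action and be run by the system owner, not end users:

```python
import os
import shutil
import time
from pathlib import Path

def cull_shared_drive(root, backup_dir, cutoff_days=365, keep_list=None):
    """Back up, then delete, every file not modified within cutoff_days,
    unless staff have nominated it for retention in keep_list."""
    keep = {str(Path(p).resolve()) for p in (keep_list or [])}
    cutoff = time.time() - cutoff_days * 86400
    deleted = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        if str(path.resolve()) in keep:
            continue  # staff nominated this document for retention
        if path.stat().st_mtime < cutoff:
            # take a backup copy first, just in case
            dest = Path(backup_dir) / path.relative_to(root)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, dest)
            path.unlink()
            deleted.append(str(path))
    return deleted
```

The point of the backup step is exactly the “tough love, not stupidity” rule: nothing is unrecoverable if someone forgot to nominate a document.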

Structuring the new shared drives

If your records manager insists on using your already overly complex, hierarchical corporate classification scheme or taxonomy as the model for the new shared drive structure, politely ask them to look for another job. Do you want this to work or not?

Records managers and archivists and librarians (and scientists) understand and love complex classification systems. However, end users don’t understand them, don’t like them and won’t use them. End users have no wish to become part-time records managers; they have their own work to do, thank you.

By all means make the new structure a subset of the classification system, major headings only and no more than two levels if possible. If it takes longer than a few seconds to decide where to save something or to find something then it is too complex. If three people save the same document in three different places then it is too complex. If a senior manager can’t find something instantly then it is too complex. The staff aren’t to blame, you are.

I have written about this issue previously and you can reference a white paper at this link, “Do you really need a Taxonomy?”

The shared drives aren’t where we classify documents; they are where we make it as easy and as fast as possible to save, retrieve and work on documents; no more, no less. Proper classification (if I can use that term) happens later when you use intelligent software to automatically capture, analyse and store documents in your document management system.

Please note, shared drives are not a document management system and a document management system should never just be a copy of your shared drives. They have different jobs to do.

Making the new system work

Let’s fall back on one of the oldest acronyms in business, KISS: “Keep It Simple, Stupid!” Simple is good and elegant; complex is bad and unfathomable.

Testing is a good example of where the KISS principle must be applied. Asking all staff to participate in the testing process may be diplomatic but it is also suicidal. You need to select your testers. You need to pick a small number of smart people from all levels of your organization. Don’t ask for volunteers, you will get the wrong people applying. Do you want participants who are committed to the system working, or those who are committed to it failing? Do you want this to succeed or not?

If I am pressed for time I use what I call the straight-line method. Imagine all staff in a straight line from the most junior to the most senior. Select from both ends, the most junior and the most senior. Chances are that if the system works for this subset it will also work for all the staff in between.
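As a trivial sketch, assuming a staff list already ordered from most junior to most senior, the straight-line method is just a matter of selecting from both ends (the function name and parameter are illustrative only):

```python
def straight_line_testers(staff_by_seniority, n_each=2):
    """Pick testers from both ends of the seniority line.

    staff_by_seniority is ordered from most junior to most senior;
    n_each is how many to take from each end.
    """
    return staff_by_seniority[:n_each] + staff_by_seniority[-n_each:]
```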

Make it clear to all that the shared drives are not your document management system. The shared drives are there for ease of access and to work on documents. The document management system has business rules to ensure that you have inviolate copies of important documents plus all relevant contextual information. The document management system is where you apply business rules and workflow. The document management system is all about business process management and compliance. The shared drives and the document management system are related and integrated but they have different jobs to do.

We have shared drives so staff don’t work on documents on ‘private’ drives, inaccessible and invisible to others. We provide a shared drive resource so staff can collaborate and share information and easily work on documents. We have shared drives so that when someone leaves we still have all their documents and work-in-process.

Please do all the complex processes required in your document management system using intelligent software, automate as much as possible. Productivity gains come about when you take work off staff, not when you load them up with more work. Give your staff as much time as possible so they can use their expertise to do the core job they were hired for.

If you don’t force extra work on your staff and if you make it as easy and as fast as possible to use the shared drives then your system will work. Do the opposite and I guarantee it will not work.

A simple guide to using shared drives to capture & classify electronic documents and emails

by Frank 18. July 2014 06:00

I have written previously about ways to solve the shared drives problem (click here) and I have written numerous articles (and a book) about ways to manage emails and electronic/digital records. However, we still receive multiple requests from customers and prospective customers about the best, and simplest, way to effectively manage these problems.

The biggest stumbling block and impediment to progress in most cases is the issue of a suitable taxonomy or classification system. Time and time again I see people putting off the solution while they spend years and tens of thousands or hundreds of thousands of dollars grappling with the construction of a suitable taxonomy. I have written about this topic previously as well and if you want my recommendations please click on this link.

If you really want the simplest, easiest to understand, easiest to use and lowest cost way to solve all of the above problems then please forget about spending the next twelve to eighteen months grappling with the nuances of your classification system. It isn’t necessary.

What you need instead is a natural classification structure that reflects your business processes. Please give your long-suffering end users something they will instantly recognize and can easily work with because it is familiar from their day to day work. Give them something to work with that doesn’t require them to become amateur records managers battling to decipher a complex, hierarchical classification system that requires an intricate knowledge of classification theory to interpret correctly. Give them something that makes it as easy as possible to file everything in the right place first time with absolutely minimal effort. Give them something that makes it as easy as possible to find something.

What I am proposing isn’t a hundred-percent solution and it won’t suit every organization but I guarantee that it will turn chaos into order in any organization that implements it. You may well see it as an eighty-five-percent solution but that is a hell of a lot better than no solution. It is also easy and fast to implement and relatively low cost (you will need some form of RM software).

First up, you need to make decisions about what kind of business you are. Notice that I said “what kind of business you are”, not “what kind of records you manage” or “how your business is structured”. Most importantly, strongly resist the temptation to base your classification structure on your existing business structure or organization’s departments/agencies and instead base it on your most common business processes. Please refer to the following extract from:

Overview of Classification Tools for Records Management by the National Archives of Australia, ISBN 0 642 34499 X (an excellent reference document if you need to understand classification systems).

“Classifying records and business information by functions and activities moves away from traditional classification based on organisational structure or subject. Functions and activities provide a more stable framework for classification than organisational structures that are often subject to change through amalgamation, devolution and decentralisation. The structure of an organisation may change many times, but the functions an organisation carries out usually remain much the same over time.”

I would also strongly resist the temptation to build your classification structure on content; it is way too difficult. Instead, as I have said above, base it on your common business processes.

When I say classification structure I mean the way you name and organize folders in your shared drives. I can’t give you a generic solution because I am not that clever; I don’t know enough about your business. I can however, give you an example.

Please also remember that for the most part, we are dealing with unstructured source information; Word, Excel, PowerPoint, Emails, etc. Emails are a little easier to deal with because they have a limited but common structure, e.g., Date Received, Sender, Recipient, CC and Subject. With other electronic documents we have far less information and are usually limited to Author (not reliable), Date Created, Date Modified and Filename. Ergo, as I said earlier, trying to base a classification system on the content of unstructured documents is both difficult and inexact. It is certainly doable but you will have to spend a lot more money on consulting and sophisticated software to achieve your ends.

In my simple example of my simple system I am going to assume that your business is customer (or client) centric, i.e., as opposed to being case-centric or project-centric, etc. The top level of your classification structure therefore will be the client name and/or number. To make it as simple as possible I am going to propose only two levels. The second level represents your most common business processes, that is, what you do with each customer. So for example, I have:

Customer Name

     Correspondence

     Contracts

     Quotes & Proposals

     Orders

     Incidents

I am also not going to differentiate between emails and other types of electronic documents, I am going to treat them all the same.
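As an illustration only, the two-level structure above takes just a few lines to create; the process folder names here are the example ones and should be replaced with your own most common business processes:

```python
from pathlib import Path

# Second level reflects common business processes, not departments.
# These names are the example set from the text, not a recommendation.
PROCESS_FOLDERS = ["Correspondence", "Contracts", "Quotes & Proposals",
                   "Orders", "Incidents"]

def create_customer_folders(shared_root, customer_name):
    """Create the two-level customer/process structure on the shared drive."""
    for process in PROCESS_FOLDERS:
        (Path(shared_root) / customer_name / process).mkdir(
            parents=True, exist_ok=True)
```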

Now how does this simple system work?

  1. Staff producing electronic documents don’t have their ‘own’ shared drive; all staff use the common classification structure. This is very important: let one or more people be exceptions and you no longer have a system you can rely on for reliable retrieval or for meeting any compliance legislation you are subject to.
  2. Staff drag and drop or ‘save-as’ emails from their email client to the correct sub-folder.
  3. Similarly, staff save (or drag and drop) electronic documents into the correct sub-folder. You can control access if required by applying security to electronic documents.
  4. You purchase or build a document repository (based on any common database such as SQL Server, MySQL, etc.) and within this repository you replicate the folder structure of your shared drives with logical folders and subfolders.
  5. You purchase or build a tool that constantly monitors the shared drives (e.g., using the .NET FileSystemWatcher class) and that instantly captures a copy of any new or modified document (you do need to configure your repository to automatically version modified documents). You may also decide to automatically delete the original source document after it has been captured.
  6. You build or purchase a records and document management software package that allows you to index, search and report on all the information in your repository.
  7. You train your staff in how to save and search for information (shouldn’t take more than half a day to a day) and then you go live.
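As an illustrative sketch of step 5, here is a simplified polling pass over the shared drives. A real tool would use an event-driven API such as the .NET FileSystemWatcher rather than polling, and would write into a database-backed repository rather than a plain folder tree; the names and layout here are assumptions for the example:

```python
import shutil
from pathlib import Path

def capture_changes(shared_root, repo_root, snapshot):
    """One polling pass: copy any new or modified document into the
    repository, versioning modified ones.

    snapshot maps relative path -> last seen mtime and is updated
    in place between passes.
    """
    captured = []
    for path in Path(shared_root).rglob("*"):
        if not path.is_file():
            continue
        rel = path.relative_to(shared_root)
        mtime = path.stat().st_mtime
        if snapshot.get(rel) == mtime:
            continue  # unchanged since the last pass
        # version modified documents rather than overwriting the copy
        version = 1
        dest = Path(repo_root) / rel
        while dest.exists():
            version += 1
            dest = Path(repo_root) / rel.parent / f"{rel.stem}.v{version}{rel.suffix}"
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(path, dest)
        snapshot[rel] = mtime
        captured.append(str(dest))
    return captured
```

The key point the sketch illustrates is that the repository keeps inviolate, versioned copies while staff keep working on the shared drive originals.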

I would also recommend a retention schedule based on sub-folder (e.g., contracts) and date created, applied automatically by the records management system to manage the lifecycle of captured documents. There is no sense in retaining information longer than you have to; it is also a dangerous practice.
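A minimal sketch of that lifecycle rule follows. The retention periods shown are invented examples only; real periods must come from your own retention schedule and jurisdiction:

```python
from datetime import date

# Hypothetical retention periods (years) keyed by process sub-folder.
# Replace with the periods from your actual retention schedule.
RETENTION_YEARS = {"Correspondence": 3, "Contracts": 7,
                   "Quotes & Proposals": 2, "Orders": 7, "Incidents": 10}

def destruction_date(sub_folder, created):
    """Earliest date a captured document may be destroyed, based on
    its process sub-folder and creation date."""
    years = RETENTION_YEARS[sub_folder]
    try:
        return created.replace(year=created.year + years)
    except ValueError:  # document created on 29 February
        return created.replace(year=created.year + years, day=28)

def due_for_destruction(sub_folder, created, today=None):
    """True when the document's retention period has expired."""
    return (today or date.today()) >= destruction_date(sub_folder, created)
```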

Please note that the above is just an example and a very simple one at that. You need to determine the most appropriate folder structure for your organization.

WARNING

Do not let the folder structure become overly complex and unwieldy. If you do, it won’t work and you will end up with lots of stuff either not captured or captured to the wrong place. The basic rules are that if it takes more than a few seconds to decide where to file something then it is too complex, and that any structure more than three levels deep is too complex.

And finally, this isn’t just a theory, it is something we do in our organization and it is something many of our customers do. If you would like to read more on this approach there are some white papers and more explanations at this link. Alternatively, you can contact us and ask questions at this link.

Good luck.

 

Are you still struggling with physical records management, with paper?

by Frank 16. July 2014 00:01

 

We produced our first computerised records management system in 1984 (when our company was called GMB) and it was called DocFind. It was marketed by the Burroughs Corporation initially to about 100 clients and then we started marketing DocFind direct and sold it to about another 2,000 clients.

Every one of those clients wanted DocFind just to manage physical records, paper, file folders and archive boxes. There was little or no demand for document imaging and workflow and the term electronic document management had yet to be invented. Office automation was in its infancy. We, for example, wrote our letters on an Apple IIe using a word processor called WordStar running under CP/M.

In 1986 we released RecFind, a major remake of the DocFind product. This product was initially marketed by us and NEC and it too focussed just on managing physical records.

However, even in 1986 we knew we had a bigger job to do with the general acceptance of document scanners and workflow, so we added imaging and workflow to our product and started trying to convince our customers and prospective customers to reduce the size of their paper mountain and even to start planning for a ‘Paperless Office’.

In the late 1980s and early 1990s I delivered numerous papers extolling the value of the paperless office and worked hard to convince my customers to make the move to Electronic Document and Records Management (EDRMS).

In the mid-1990s the industry discovered ‘Knowledge Management’ (KMS) and industry consultants lost interest in EDRMS and instead heavily promoted the virtues and benefits of KMS, whatever it was. Maybe this was the time organizations lost interest in eradicating paper as senior IT staff and consultants moved on to more interesting projects like KMS.

In 1995 I delivered my first paper on a totally integrated information management system or what I called at the time the ‘It Does Everything Application’ (IDEA). In 1995 I truly thought the age of physical records management was almost over and that the western world at least would move to fully-automated, paperless processes.

How wrong I was 19 years ago.

Today, despite the advanced functionality of our RecFind 6 Product Suite, almost all of my customers still manage physical records with RecFind 6. At least half of the inquiries that come in via our website are for systems to manage physical records.

There is more paper in the world today than there has ever been and organizations all over the world still struggle with managing paper, vast amounts of paper.

Luckily for us, we never succumbed to the temptation to remove the paper handling features from our products. Instead, we added to them with each subsequent release and redesign/rewrite of RecFind. We had to provide upwards compatibility for our clients as they still managed mountains of paper both onsite and offsite.

Being a little older and wiser now I am never again going to predict the paperless office. I will provide advanced physical records management functionality for my clients as long as they require it.

I haven’t given up the fight but my job is to address the real needs of my customers and they tell me and keep telling me that they need to manage paper, mainly file folders full of paper and archive boxes full of file folders. They need to manage paper onsite in shelving and offsite in warehouses with millions of boxes and we do it all.

We manage paper from creation to destruction and throughout the whole lifecycle. We apply retention schedules and classification systems and we track anything and everything with barcodes and barcode readers. We have enhanced our products to cater for every need and we are now probably responsible for millions of tonnes of paper all over the world.

I still hope for a paperless world but I very much doubt that I am going to see it in my lifetime.

So, if you are still struggling with how to best manage all your physical records please don’t despair, you are most certainly not alone! 

  

What is the future of RecFind? - The Product Road Map

by Frank 19. May 2014 06:00

First a little history. We began in 1984 with our first document management application, DocFind, marketed by the then Burroughs Corporation (now called Unisys). In June 1986 we sold the first version of RecFind, a fully-featured electronic records management system and a vast improvement on the DocFind product. We then progressively added document imaging, then electronic document management and workflow, and then, with RecFind 6, a brand new paradigm and an amalgam of all previous functionality: an information management system able to run multiple applications concurrently with a complete set of enterprise content management functionality. RecFind 6 is the eighth completely new iteration of the iconic RecFind brand.

RecFind 6 was and is unique in our industry because it was designed to be what was previously called a Rapid Application Development (RAD) system but, unlike previous examples, we provided the high level toolset so new applications could be inexpensively ‘configured’ (by using the DRM), not expensively programmed, and new application tables and fields easily populated using Xchange. It immediately provided every customer with the ability to change almost anything they needed changed without needing to deal with the vendor (us). Each customer had the same tools we used to configure multiple applications within a single copy of RecFind 6. RecFind 6 was the first ECM product to truly empower the customer and to release them from the expensive and time-consuming process of having to negotiate with the vendor to “make changes and get things done.”

In essence, the future of the RecFind brand can be summarised as more of the same but as an even easier to use and more powerful product. Architecturally, we are moving away from the fat-client model (in our case based on the .NET smart-client paradigm) to the zero-footprint, thin-client model to reduce installation and maintenance costs and to support far more operating system platforms than just Microsoft Windows. The new version 2.6 web-client for instance happily runs on my iPad within the Safari browser and provides me with all the information I need on my customers when I travel or work from home (we use RecFind 6 as our Customer Relationship Management system or CRM). I no longer need a PC at home and nor do I need to carry a heavy laptop through airports.

One of my goals for the remainder of 2014 and into 2015 is to convince my customer base to move to the RecFind 6 web-client from the standard .NET smart-client. This is because the web-client provides tangible, measurable cost benefits and will be the basis for a host of new features as we gradually deprecate the .NET smart-client and expand the functionality of the web-client. We do not believe there is a future for the fat/smart-client paradigm; it has seen its day. Customers are rightfully demanding a zero footprint and the support of an extensive range of operating environments and devices including mobile devices such as smartphones and tablets. Our web-client provides the functionality, mobile device support and convenience they are demanding.

Of course the back-end of the product, the image and data repository, also comes in for major upgrades and improvements. We are sticking with MS SQL Server as our database but will incorporate a host of new features and improvements to better facilitate the handling of ‘big data’. We will continue to research and make improvements to the way we capture, store and retrieve data and, because our customers’ databases are now so large (measured in hundreds of Gigabytes), we are making it easier and faster to both backup and audit the repository. The objectives as always are scalability, speed, security and robustness.

We are also adding new functionality to allow the customer to bypass our standard user interface (e.g., the .NET smart-client or web-client) and create their own user interface or presentation layer. The objective is to make it as easy as possible for the customer to create tailored interfaces for each operating unit within their organization. A simple way to think of this functionality is to imagine a single high level tool that lets you quickly and easily create your own screens and dashboards and program to our SDK.

On the add-in product front we will continue to invest in our add-in products such as the Button, the MINI API, the SDK, GEM, RecCapture, the High Speed Scanning Module and the SharePoint Integration Module. Even though the base product RecFind 6 has a full complement of enterprise content management functionality these add-on products provide options requested by our customers. They are generally a way to do things faster and more automatically.

We will continue to provide two approaches for document management; the end-user paradigm (RecFind 6 plus the Button) and the fully automatic capture and classification paradigm (RecFind 6 plus GEM and RecCapture). As has been the case, we also fully expect a lot of our customers to combine both paradigms in a hybrid solution.

The major architectural change is away from the .NET smart-client (fat-client) paradigm to the browser-based thin-client or web-client paradigm. We see this as the future for all application software, unconstrained by the strictures of proprietary operating systems like Microsoft Windows.

As always, our approach, our credo, is that we do all the hard work so you don’t have to. We provide the feature rich, scalable and robust image and data repository and we also provide all of the high level tools so you can configure your applications that access our repository. We also continue to invest in supporting and enhancing all of our products making sure that they have the feature set you require and run in the operating environments you require them to. We invest in the ongoing development of our products to protect your investment in our products. This is our responsibility and our contribution to our ongoing partnership.

 

Technology Trends for 2014 – A developer’s perspective

by Frank 7. January 2014 06:00

I run a software company called the Knowledgeone Corporation and we produce enterprise content management software for government and business. Because it takes so long to design, build and test a new product or even a new version, we have to try and predict where the market will be in one or two years and then try to make sure our product RecFind 6 ‘fits-in’ with future requirements.

Years ago it was much easier because we were sure Windows would be the dominant factor and mostly we had to worry about compatibility with the next version of Windows and Microsoft Office. Apple, however, changed the game with first the iPhone and then the iPad.

We now need to be aware of a much wider range of devices and operating systems; smart phones and tablets in particular. Three years ago we decided to design in compatibility for iOS and Android and we also decided to ignore Blackberry; so far, a wise move.

However, the prediction business is getting harder because the game is changing faster and probably faster than we can change our software (a major application).

I was just reading about CES 2014 on ZDNet and the major technologies previewed and displayed there. Most are carry-overs from 2013 and I haven’t noted anything really new but even so, the question is which of these major trends will become major players during 2014 and 2015 (our design, develop and test window for the next major release of RecFind 6)?

1. Wearables

2. The Internet of Things

3. Contextual Computing (or Predictive Computing)

4. Consumerization of business tech

5. 3D printing

6. Big Data

7. The Cloud

Larry Dignan, Editor in Chief of ZDNet, wrote an excellent summary of things to think about for 2014, see this link:

Larry sees China and emerging Chinese companies as major players outside of China in 2014 but I think the Europeans and Americans will resist until well into 2015 or later. Coming on the heels of the Global Financial Crisis of 2008 their governments won’t take kindly to having their local high tech industries swamped by Chinese giants. He also talks about the fate of Windows 8 and the direction of the PC market and this is our major concern.

The PC market has been shrinking and even though Microsoft is still by far the major player, a lot depends upon the acceptance of Windows 8 as the default operating system. Personally, I saw the Windows 8 Metro interface as clumsy and as change for change’s sake.

I really don’t understand Microsoft’s agenda. Why try to force a major change like this on consumers and businesses just when everyone is happy with Windows 7 and we have all almost forgotten Vista? Windows 8 isn’t an improvement over Windows 7, just as Office 2013 isn’t an improvement over Office 2010. Both are just different and, in my opinion, less intuitive and more difficult to use.

Try as I might, I cannot see any benefits to anyone in moving from Windows 7 to Windows 8 and in moving from Office 2010 to Office 2013. The only organization benefiting would be Microsoft and at the cost of big disruptions to its loyal customers.

Surely this isn’t a wise thing to do in an era of falling PC sales? Why exacerbate the problem?

Smart phones and tablets are real and growing in importance. Android and iOS are the two most important ‘new’ operating systems to support and most importantly for us, browsers are the application carriers of the future. No software vendor has the resources to support all the manifestations of Windows, Linux, Android, iOS, etc., in ‘native’ form but all operating systems support browsers. Browsers have become what Windows was ten years ago. That is, a way to reach most of the market with a single set of source code.

We lived through the early days of DOS, UNIX, Windows and the AS/400 and at one time had about fifteen different sets of source code for RecFind. No vendor wants to go back to those bad old days. When the world settled on Windows it meant that most of us could massively simplify our development regime and revert to a single set of source code to reach ninety-percent of the market. In the early days, Windows was our entry point to the world. Today it is browsers.

Of course not all browsers are equal and there is extra work to do to support different operating systems, especially sand-boxed ones like iOS, but we are still running ninety-five percent common source and five percent variations so it is eminently manageable.

Does Microsoft realize that many developers like us now target browsers as our main application carriers and not Windows? Does it also realize that the Windows 8 Metro interface was the catalyst that pushed many more developers along this same path?

Let’s hope that the new CEO of Microsoft cares more about his customers than the previous one did. If not, 2014 won’t just be the post-PC era, it will also be the beginning of the post-Microsoft era.
