RF6 Connector - SharePoint Online Integration

by Frank 1. December 2021 19:30

 

Now easily integrate RF6 with SharePoint Online using our new RF6 Connector

This new product utilizes the very latest Microsoft Power Automate, Power Apps and Power BI to provide the most powerful, most seamless and most flexible integration to SharePoint Online.

Note that you can use the RF6 Connector to connect to any system via the MS Power Platform, not just SharePoint Online.

The RF6 Connector replaces our legacy product, the SharePoint Integration Module, and provides a superset of the old functionality.
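To make the plumbing a little more concrete, here is a minimal sketch (in Python, purely for illustration) of the kind of SharePoint Online call a Power Platform flow ultimately makes: uploading a document into a document library via the Microsoft Graph API. The site ID, access token, folder and file names are placeholders, and the RF6 export side is assumed; the RF6 Connector itself is configured in Power Automate rather than hand-coded.

```python
# Minimal sketch: pushing a document into a SharePoint Online library via the
# Microsoft Graph API. Site ID, token and file names are placeholders; in the
# RF6 Connector this kind of call is driven by a Power Automate flow.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
SITE_ID = "<your-sharepoint-site-id>"      # assumption: looked up via the Graph /sites endpoint
ACCESS_TOKEN = "<azure-ad-access-token>"   # assumption: acquired from Azure AD (e.g. with MSAL)

def upload_to_sharepoint(local_path: str, target_folder: str, file_name: str) -> dict:
    """Upload a small file (under 4 MB) to a SharePoint Online document library."""
    url = f"{GRAPH}/sites/{SITE_ID}/drive/root:/{target_folder}/{file_name}:/content"
    with open(local_path, "rb") as fh:
        resp = requests.put(
            url,
            headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
            data=fh.read(),
        )
    resp.raise_for_status()
    return resp.json()  # DriveItem metadata for the uploaded document

if __name__ == "__main__":
    item = upload_to_sharepoint("contract.pdf", "RF6 Exports", "contract.pdf")
    print(item.get("webUrl"))
```

The same Graph call works for any target you reach through the Power Platform, which is the point of going via Microsoft's stack rather than building point-to-point integrations.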

The Significant Benefits of Document Imaging

by Frank 20. January 2020 12:24

 

 

We have all known about the significant benefits of Document Imaging for at least 30 years.

Document imaging, the scanning of paper documents to convert them to digital images, together with workflow, was the real beginning of office automation.

The advent of Document Imaging in the early 1980s did for office automation what barcode technology did for physical records management and asset management. It allowed manual processes to be automated and improved; it provided tangible and measurable productivity improvements as well as demonstrably better access to information for the then fledgling knowledge worker.

However, over 30 years later, we have a paradox: we all take document imaging for granted, but we still don’t utilize it to anything like its full capability. Most organizations use document scanners of one kind or another, usually on multi-function devices, but we still don’t appear to use document scanning nearly enough to automate time-consuming and often critical business processes.

I don’t really know why we are not utilizing document scanning more widely because it obviously isn’t a matter of lacking the tools; we have every type of document scanner imaginable and every type of document scanning software conceivable.  The struggling knowledge workers of 30 years ago would be ecstatic with the options available today.

We just seem to be stuck in first gear, or maybe we are just not applying enough thought to analyzing our day-to-day business processes.

Business process management based on the circulation of paper documents in 2020 is archaic, wasteful, inefficient and highly prone to error. Yet many organizations I deal with still have critical business processes based on the circulation of paper. How incredibly careless and dangerous is that?

Let’s look at the benefits at the most basic level: 

  • How many people can read a paper document at any one point in time? The answer is one and one only.
  • How many people can look at a digital image of a document at any one point in time? The answer is as many as need to.
  • How hard is it to lose or damage a paper document? The answer is it is really, really easy to lose, damage or deface a paper document.
  • How hard is it to lose or damage or deface or even change a secure digital copy of a document? The answer is it is almost impossible in a well-managed document management system.

So why are we still circulating paper documents to support critical business processes? Why aren’t we simply digitizing these important paper documents and making the business process infinitely faster and more secure? For the life of me, I can’t think of a single valid reason for not digitizing important paper documents. The technology is readily available with oodles of choice; it isn’t difficult to use and it isn’t expensive. In fact, digitizing paper will always save you time and money.

So why do I still see so many organizations, large and small, still relying on the circulation of paper documents to support important business processes? Is it a lack of thought, a lack of imagination or a lack of education? Can it really be true that more than thirty years after the beginning of the office automation revolution we still have tens of thousands or even millions of knowledge workers with little knowledge of, or access to, basic office automation?

In a world awash in technology like computers, laptops, iPhones and iPads how can we be so terribly ignorant of the application and benefits of such a basic and proven technology as document imaging?

In my experience, some of the worst examples can be found in large financial organizations like banks and insurance companies. The public perception is that banks are right up there with the latest technology, and most people point to examples like banking and payment systems on smartphones as evidence of that. But go behind the front office to the back office and you will usually see a very different world; a world of paper and manual processes, many on the IT department’s ‘backlog’ of things to attend to, eventually.

The message is a simple one. If you have business processes based on the circulation of paper, you are inefficient and are wasting money and the time of your staff and customers. You are also taking risks with the integrity of your data and your customers’ data.

Please do everyone a favor and look carefully at the application of document imaging, a well-proven, affordable, easy-to-implement and easy-to-manage business process automation tool.

 

The Records Management Standard ISO 15489 and Wikipedia as a handy reference for Records Managers

by Frank 16. December 2019 10:49

While all records managers would be aware of the ISO 15489 standard, few have access to the full document because it is a proprietary and costly item. Wherever possible, we encourage records managers to submit a request to management to secure a copy of this valuable standard.

However, if that is not possible there is a reasonably comprehensive description of the records management process on Wikipedia that all records managers can reference for free. We encourage you to access and study this reference material. It is especially relevant to those people just entering this profession.

Please see the following links:

https://en.wikipedia.org/wiki/Records_management

Contents

  • 1. Concepts of record
  • 2. Key records management terminology
  • 3. Records management theory
      ◦ 3.1 Records life-cycle
      ◦ 3.2 Records continuum theory
  • 4. Records management practices and concepts
      ◦ 4.1 Defensible solutions
      ◦ 4.2 Classification
          ▪ 4.2.1 Enterprise records
          ▪ 4.2.2 Industry records
          ▪ 4.2.3 Legal hold records
      ◦ 4.3 Records retention schedule
  • 5. Managing physical records
  • 6. Managing digital records
  • 7. Current issues
  • 8. Education and certification
  • 9. Electronic records management systems
      ◦ 9.1 Commercial records centers
  • 10. See also
  • 11. References
  • 12. External links

As with all Wikipedia articles, you may not agree 100% with the content. However, as an overview and reference we find this article to be of value.

We sincerely hope that you find this information useful.

How to clean up your shared drives, Frank’s approach

by Frank 22. August 2014 06:00

In my time in this business (enterprise content management, records management, document management, etc.) I have been asked to help with a ‘shared drive problem’ more times than I can remember. This particular issue is analogous to the paperless office problem. Thirty years ago, when I started my company, I naively thought that both problems would be long gone by now, but they are not.

I still get requests for purely physical records management solutions and I still get requests to assist customers in sorting out their shared drives problems.

The tools and procedures to solve both problems have been around for a long time but for whatever reason (I suspect lack of management focus) the problems still persist and could be described as systemic across most industry segments.

Yes, I know that you can implement an electronic document and records management system (we have one called RecFind 6) and take away the need for shared drives and physical records management systems completely, but most organizations don’t, and most organizations still struggle with shared drives and physical records. This post addresses the reality.

Unfortunately, the most important ingredient in any solution is ‘ownership’ and that is as hard to find as it ever was. Someone with authority, or someone who is prepared to assume authority, needs to take ownership of the problem in a benevolent-dictator way and just steam-roll a solution through the enterprise. It isn’t solvable by committees and it requires a committed, driven person to make it happen. These kinds of people are in short supply, so if you don’t have one, bring one in.

In a nutshell, there are three basic problems apart from ownership:

1.     How to delete all redundant information;

2.     How to structure the ‘new’ shared drives; and

3.     How to make the new system work to most people’s satisfaction.

Deleting redundant Information

Rule number one is don’t ever ask staff to delete the information they regard as redundant. It will never happen. Instead, tell staff that you will delete all documents in your shared drives with a created or last-updated date older than a nominated cutoff (say, one year in the past) unless they tell you specifically which ‘older’ documents they need to retain. Just saying “all of them” is not an acceptable response. Give staff advance notice of a month and then delete everything that has not been nominated as important enough to retain. Of course, take a backup of everything before you delete, just in case. This is tough love, not stupidity.
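To make the rule concrete, here is a minimal dry-run sketch (the share path and retention list are invented for illustration) that simply reports every file older than the nominated cutoff that nobody has asked to keep. Actual deletion only happens after the backup.

```python
# Dry-run sketch of the clean-up rule: list every file older than the cutoff
# that is not on the retention list. Paths are placeholders; take a full
# backup before deleting anything.
import os
from datetime import datetime, timedelta

SHARE_ROOT = r"\\fileserver\shared"            # assumption: UNC path of the shared drive
CUTOFF = datetime.now() - timedelta(days=365)  # "one year in the past"
RETAIN = {                                     # paths staff nominated to keep
    r"\\fileserver\shared\Finance\2018-audit.xlsx",
}

def stale_files(root: str):
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                modified = datetime.fromtimestamp(os.path.getmtime(path))
            except OSError:
                continue  # unreadable file; skip rather than guess
            if modified < CUTOFF and path not in RETAIN:
                yield path, modified

if __name__ == "__main__":
    for path, modified in stale_files(SHARE_ROOT):
        print(f"{modified:%Y-%m-%d}  {path}")  # report only; delete later, after the backup
```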

Structuring the new shared drives

If your records manager insists on using your already overly complex, hierarchical corporate classification scheme or taxonomy as the model for the new shared drive structure, politely ask them to look for another job. Do you want this to work or not?

Records managers and archivists and librarians (and scientists) understand and love complex classification systems. However, end users don’t understand them, don’t like them and won’t use them. End users have no wish to become part-time records managers; they have their own work to do, thank you.

By all means make the new structure a subset of the classification system: major headings only and no more than two levels if possible. If it takes longer than a few seconds to decide where to save something or to find something, then it is too complex. If three people save the same document in three different places, then it is too complex. If a senior manager can’t find something instantly, then it is too complex. The staff aren’t to blame, you are.
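If you want a quick sanity check on the two-level rule, a small sketch like the one below (the share path is assumed) will flag any folder that holds files more than two levels below the root.

```python
# Sketch: flag folders holding files deeper than two levels below the share root.
import os

SHARE_ROOT = r"\\fileserver\shared"  # assumption: UNC path of the share root
MAX_DEPTH = 2                        # e.g. \shared\Finance\Invoices is deep enough

def folders_too_deep(root: str, max_depth: int):
    root = os.path.abspath(root)
    for dirpath, _dirs, files in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        depth = 0 if rel == "." else len(rel.split(os.sep))
        if depth > max_depth and files:
            yield dirpath, depth

if __name__ == "__main__":
    for folder, depth in folders_too_deep(SHARE_ROOT, MAX_DEPTH):
        print(f"{depth} levels deep: {folder}")
```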

I have written about this issue previously and you can reference a white paper at this link, “Do you really need a Taxonomy?”

The shared drives aren’t where we classify documents; they are where we make it as easy and as fast as possible to save, retrieve and work on documents; no more, no less. Proper classification (if I can use that term) happens later, when you use intelligent software to automatically capture, analyse and store documents in your document management system.

Please note, shared drives are not a document management system and a document management system should never just be a copy of your shared drives. They have different jobs to do.

Making the new system work

Let’s fall back on one of the oldest acronyms in business, KISS, “Keep It Simple Stupid!” Simple is good and elegant, complex is bad and unfathomable.

Testing is a good example of where the KISS principle must be applied. Asking all staff to participate in the testing process may be diplomatic but it is also suicidal. You need to select your testers. You need to pick a small number of smart people from all levels of your organization. Don’t ask for volunteers, you will get the wrong people applying. Do you want participants who are committed to the system working, or those who are committed to it failing? Do you want this to succeed or not?

If I am pressed for time I use what I call the straight-line method. Imagine all staff in a straight line from the most junior to the most senior. Select from both ends, the most junior and the most senior. Chances are that if the system works for this subset it will also work for all the staff in between.
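For the programmers among you, the straight-line method is trivial to express; the staff list and names below are illustrative only.

```python
# Sketch of the straight-line method: given staff ordered from most junior to
# most senior, pick testers from both ends of the line.
def straight_line_testers(staff_by_seniority: list[str], per_end: int = 2) -> list[str]:
    """Return `per_end` people from the junior end and `per_end` from the senior end."""
    if len(staff_by_seniority) <= 2 * per_end:
        return list(staff_by_seniority)  # small team: everyone tests
    return staff_by_seniority[:per_end] + staff_by_seniority[-per_end:]

if __name__ == "__main__":
    line = ["junior clerk", "records officer", "team lead", "department manager", "CIO"]
    print(straight_line_testers(line))
    # ['junior clerk', 'records officer', 'department manager', 'CIO']
```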

Make it clear to all that the shared drives are not your document management system. The shared drives are there for ease of access and to work on documents. The document management system has business rules to ensure that you have inviolate copies of important documents plus all relevant contextual information. The document management system is where you apply business rules and workflow. The document management system is all about business process management and compliance. The shared drives and the document management system are related and integrated but they have different jobs to do.

We have shared drives so staff don’t work on documents on ‘private’ drives, inaccessible and invisible to others. We provide a shared drive resource so staff can collaborate and share information and easily work on documents. We have shared drives so that when someone leaves we still have all their documents and work-in-process.

Please do all the complex processes required in your document management system using intelligent software, automate as much as possible. Productivity gains come about when you take work off staff, not when you load them up with more work. Give your staff as much time as possible so they can use their expertise to do the core job they were hired for.

If you don’t force extra work on your staff and if you make it as easy and as fast as possible to use the shared drives then your system will work. Do the opposite and I guarantee it will not work.

What is the future of RecFind? - The Product Road Map

by Frank 19. May 2014 06:00

First a little history. We began in 1984 with our first document management application, DocFind, marketed by the then Burroughs Corporation (now called Unisys). In June 1986 we sold the first version of RecFind, a fully-featured electronic records management system and a vast improvement on the DocFind product. Then we progressively added document imaging, electronic document management and workflow, and then, with RecFind 6, a brand new paradigm and an amalgam of all previous functionality: an information management system able to run multiple applications concurrently with a complete set of enterprise content management functionality. RecFind 6 is the eighth completely new iteration of the iconic RecFind brand.

RecFind 6 was and is unique in our industry because it was designed to be what was previously called a Rapid Application Development (RAD) system, but unlike previous examples we provided the high-level toolset so new applications could be inexpensively ‘configured’ (by using the DRM), not expensively programmed, and new application tables and fields easily populated using Xchange. It immediately provided every customer with the ability to change almost anything they needed changed without needing to deal with the vendor (us). Each customer had the same tools we used to configure multiple applications within a single copy of RecFind 6. RecFind 6 was the first ECM product to truly empower the customer and to release them from the expensive and time-consuming process of having to negotiate with the vendor to “make changes and get things done.”

In essence, the future of the RecFind brand can be summarised as more of the same, but as an even easier-to-use and more powerful product. Architecturally, we are moving away from the fat-client model (in our case based on the .NET smart-client paradigm) to the zero-footprint, thin-client model to reduce installation and maintenance costs and to support far more operating system platforms than just Microsoft Windows. The new version 2.6 web-client, for instance, happily runs on my iPad within the Safari browser and provides me with all the information I need on my customers when I travel or work from home (we use RecFind 6 as our Customer Relationship Management system, or CRM). I no longer need a PC at home and nor do I need to carry a heavy laptop through airports.

One of my goals for the remainder of 2014 and for 2015 is to convince my customer base to move to the RecFind 6 web-client from the standard .NET smart-client. This is because the web-client provides tangible, measurable cost benefits and will be the basis for a host of new features as we gradually deprecate the .NET smart-client and expand the functionality of the web-client. We do not believe there is a future for the fat/smart-client paradigm; it has seen its day. Customers are rightfully demanding a zero footprint and the support of an extensive range of operating environments and devices, including mobile devices such as smartphones and tablets. Our web-client provides the functionality, mobile device support and convenience they are demanding.

Of course, the back-end of the product, the image and data repository, also comes in for major upgrades and improvements. We are sticking with MS SQL Server as our database but will incorporate a host of new features and improvements to better facilitate the handling of ‘big data’. We will continue to research and make improvements to the way we capture, store and retrieve data, and because our customers’ databases are now so large (measured in hundreds of gigabytes), we are making it easier and faster to both back up and audit the repository. The objectives as always are scalability, speed, security and robustness.

We are also adding new functionality to allow the customer to bypass our standard user interface (e.g., the .NET smart-client or web-client) and create their own user interface or presentation layer. The objective is to make it as easy as possible for the customer to create tailored interfaces for each operating unit within their organization. A simple way to think of this functionality is to imagine a single high level tool that lets you quickly and easily create your own screens and dashboards and program to our SDK.

On the add-in product front we will continue to invest in our add-in products such as the Button, the MINI API, the SDK, GEM, RecCapture, the High Speed Scanning Module and the SharePoint Integration Module. Even though the base product RecFind 6 has a full complement of enterprise content management functionality these add-on products provide options requested by our customers. They are generally a way to do things faster and more automatically.

We will continue to provide two approaches for document management; the end-user paradigm (RecFind 6 plus the Button) and the fully automatic capture and classification paradigm (RecFind 6 plus GEM and RecCapture). As has been the case, we also fully expect a lot of our customers to combine both paradigms in a hybrid solution.

The major architectural change is away from the .NET smart-client (fat-client) paradigm to the browser-based thin-client or web-client paradigm. We see this as the future for all application software, unconstrained by the strictures of proprietary operating systems like Microsoft Windows.

As always, our approach, our credo, is that we do all the hard work so you don’t have to. We provide the feature-rich, scalable and robust image and data repository and we also provide all of the high-level tools so you can configure your applications that access our repository. We also continue to invest in supporting and enhancing all of our products, making sure that they have the feature set you require and run in the operating environments you require them to. We invest in the ongoing development of our products to protect your investment in our products. This is our responsibility and our contribution to our ongoing partnership.

 

Are you also confused by the term Enterprise Content Management?

by Frank 16. September 2012 06:00

I may be wrong but I think it was AIIM that first coined the phrase Enterprise Content Management to describe both our industry and our application solutions.

Whereas the term isn’t as nebulous as Knowledge Management it is nevertheless about as useful when trying to understand what organizations in this space actually do. At its simplest level it is a collective term for a number of related business applications like records management, document management, imaging, workflow, business process management, email management and archiving, digital asset management, web site content management, etc.

To simple people like me the more appropriate term or label would be Information Management, but as I have already covered this in a previous blog I won’t belabour the point in this one.

When trying to define what enterprise content management actually means or stands for we can discard the words ‘enterprise’ and ‘management’ as superfluous to our needs and just concentrate on the key word ‘content’. That is, we are talking about systems that in some way create and manage content.

So, what exactly is meant by the term ‘content’?

In the early days of content management discussions we classified content into two broad categories, structured and unstructured. Basically, structured content had named sections or labels and unstructured content did not. Generalising even further we can say that an email is an example of structured content because it has commonly named, standardised and accessible sections or labels like ‘Sender’, ‘Recipient’, ‘Subject’ etc., that we can interrogate and rely on to carry a particular class or type of information. The same general approach would regard a Word document as unstructured because the content of a Word document does not have commonly named and standardised sections or labels. Basically a Word document is an irregular collection of characters that you have to parse and examine to determine content.

Like Newtonian physics, the above generalisations do not apply to everything and can be argued until the cows come home. In truth, every document has an accessible structure of some kind. For example, a Word document has an author, a size, a date written, etc. It is just that it is far easier to find out who the recipient of an email was than the recipient of a Word document. This is because there is a common and standard ‘Tag’ that tells us who the recipient is of an email and there is no such common and standard tag for a Word document.

In our business we call ‘information about information’ (e.g., the recipient and date fields on an email) Metadata. If an object has recognizable Metadata then it is far easier to process than an object without recognizable Metadata. We may then say that adding Metadata to an object is the same as adding structure.
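A two-minute example makes the contrast obvious: Python’s standard email parser can hand back the sender, recipient, subject and date by name, while a Word file read straight off the disk (the file name below is assumed) is just a blob of bytes with no such standard tags to interrogate.

```python
# Structured vs unstructured: an email has commonly named, accessible metadata
# labels; a Word file read as raw bytes does not.
from email import message_from_string

raw_email = """\
From: frank@example.com
To: records@example.com
Subject: Q3 retention schedule
Date: Mon, 12 Aug 2012 09:00:00 +1000

Please find the draft schedule attached.
"""

msg = message_from_string(raw_email)
print(msg["From"], msg["To"], msg["Subject"], msg["Date"])  # structured: named labels

# assumption: some local Word document called report.docx exists
with open("report.docx", "rb") as fh:
    blob = fh.read()
print(len(blob), "bytes with no standard 'Recipient' tag to interrogate")
```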

Adding structure is what we do when we create a Word document using a template or when we add tags to a Word document. We are normalizing the standard information we require in our business processes so the objects we deal with have the structure we require to easily and accurately identify and process them.

This is of course one of the long-standing problems in our industry: we spend far too much time and money trying to parse and interpret unstructured objects when we should be going back to the coal face and adding structure when the object is first created. This is of course relatively easy to do if we are creating the objects (e.g., a Word document) but not easy to achieve if we are receiving documents from foreign sources like our customers, our suppliers or the government. Unless you are the eight-hundred-pound gorilla (like Walmart) it is very difficult to force your partners to add the structure you require to make processing as fast and as easy and as accurate as possible.

There have been attempts in the past to come up with common ‘standards’ that would have regulated document structure but none have been successful. The last one was when XML was the bright new kid on the block and the XML industry rushed headlong into defining XML standards for every conceivable industry to facilitate common structures and to make data transfer between different organizations as easy and as standard as possible. The various XML standardisation projects sucked up millions or even billions of dollars but did not produce the desired results; we are still spending billions of dollars each year parsing unstructured documents trying to determine content.

So, back to the original question, what exactly is Enterprise Content Management? The simple answer is that it is the business or process of extracting useful information from objects such as emails and PDFs and Word documents and then using that information in a business process. It is all about the process of capturing Metadata and content in the most accurate and expeditious manner possible so we can automate business processes as much as possible.

If done properly, it makes your job more pleasant and saves your organization money and it makes your customers and suppliers happier. As such it sounds a lot like motherhood (who is going to argue against it?) but it certainly isn’t like manna from heaven. There is always a cost and it is usually significant. As always, you reap what you sow and effort and cost produces rewards.

Is content management something you should consider? The answer is definitely yes with one proviso; please make sure that the benefits are greater than the cost.

 

Why isn’t Linux the universal desktop operating system?

by Frank 9. September 2012 06:00

I own and run a software company building enterprise content management solutions (RecFind 6) and I have a love/hate relationship with Microsoft Windows.

I love Windows because it is a universal platform I can develop for that provides me access to ninety-percent plus of the business and government organizations in the world.  I only need one set of source code and one set of development skills and I can leverage off this to offer my solutions to virtually any organization in any location. We may say that Microsoft Windows is ubiquitous.

I hate Windows because it is overly complex, unnecessarily difficult to build software for, buggy and causes me to have to spend far more money on software development than I ought to. There are many times each year when all I really want to do is assemble all the Microsoft programmers in one place and then bang their heads together and shout at them, “for heaven’s sake, why don’t you guys just talk to each other!”

Linux on the other hand, even in its many manifestations (one of its main problems), is not ubiquitous and it does not provide me with an entry point to ninety-percent of the world’s businesses and government agencies. This is why I don’t develop software for Linux.

Because I don’t develop application software for Linux I am not an expert in Linux but I have installed and run Ubuntu as a desktop operating system and I really like it. It is simple, clean and easy to use; more ‘Apple-like’ than ‘Windows-like’ to my eyes and all the better for it. It is also a great software development platform for programmers especially using the Eclipse IDE. It is also free and most of the office software you need (like OpenOffice) is also free. It also runs happily on virtually any PC or notebook and seems to be a lot faster than Windows.

So, Ubuntu (a flavour of Linux but a very good one) is free, most of the office software you need is also free, it looks good, runs on your hardware and is easy to use and uncomplicated. So why isn’t it ubiquitous? Why are people and organizations all over the world paying for (and struggling with – who remembers Vista?) inferior Windows when Linux varieties like Ubuntu are both free and better? Why are users and organizations now planning to pay to upgrade to Windows 7 or Windows 8 when alternative operating systems like Ubuntu will do the job and are free?

I read a lot of technical papers and IT blogs and I notice that the Linux community has been having similar discussions for years. As an ‘outsider’ (i.e., not a Linux zealot) it is pretty obvious to me that the Linux community is the main reason Linux is not ubiquitous. Please read the following ZDNet link and then tell me what you think.

http://www.zdnet.com/linus-torvalds-on-the-linux-desktops-popularity-problems-7000003641/

When I read an article like this, two terms come immediately to mind: internecine bickering and sibling rivalry. How many versions of Linux do we need? The Linux fraternity calls these distributions, or ‘distros’ to insiders. At last count there are around 600 distros, of which 300 are actively maintained. Ubuntu is just one of these distros. How would the business world fare if there were 300 versions of Windows? Admittedly, most of the 300 have been built for a specialised use and the real list of general-use versions of Linux is much smaller and includes product names such as Ubuntu, Kubuntu, Fedora, Mint, Debian, Arch, openSUSE, Red Hat and about a dozen more.

But it gets worse. On Ubuntu alone there are three main desktop environments to choose from: GNOME, KDE and Xfce. Are you confused yet? Is it now obvious why Linux is not the default desktop operating system? It probably isn’t obvious to the squabbling Linux insider community, but it is patently obvious to everyone else.

Linux isn’t the default desktop operating system because there is not a single standard and there is never likely to be a single standard. No software developer is going to invest millions of dollars in building commercial applications for Linux because of this. Without a huge library of software applications there is no commercial market for Linux. Windows reigns supreme despite its painful problems because it provides a single platform and because software developers do invest in building millions of commercial applications for the Windows operating system.

Until such time as the Linux community stops its in-fighting and produces a single robust, supported version of Linux (when hell freezes over I hear you say) the situation will not change. The inferior desktop operating system Windows will continue to dominate and Linux will remain the plaything of propeller-heads and techies and old guys like me who really like it (well, the Ubuntu version that is, there are too many distros for me to become an expert in all of them and that is the core of the problem).

Is Information Management now back in focus?

by Frank 12. August 2012 06:00

When we were all learning about what used to be called Data Processing we also learned about the hierarchy or transformation of information. That is, “data to information to knowledge to wisdom.”

Unfortunately, as information management is part of what we call the Information Technology industry (IT) we as a group are never satisfied with simple self-explanatory terms. Because of this age-old flaw we continue to invent and hype new terms like Knowledge Management and Enterprise Content Management most of which are so vague and ill-defined as to be virtually meaningless but nevertheless, provide great scope for marketing hype and consultants’ income.

Because of the ongoing creation of new terminology and the accompanying acronyms we have managed to confuse almost everyone. Personally I have always favoured the term ‘information management’ because it tells it like it is and it needs little further explanation. In the parlance of the common man it is an “old un, but a good un.”

The thing I most disliked about the muddy knowledge management term was the claim that computers and software could produce knowledge. That may well come in the age of cyborgs and true artificial intelligence but I haven’t seen it yet. At best, computers and software produce information which human beings can convert to knowledge via a unique human cognitive process.

I am fortunate in that I have been designing and programming information management solutions for a very long time, so I have witnessed first-hand the enormous improvements in technology and tools that have occurred over time. Basically this means I am able to design and build an infinitely better information management solution today than I could have twenty-nine years ago when I started this business. For example, the current product RecFind 6 is a much better, more flexible, more feature-rich and more scalable product than the previous K1 product, and it in turn was an infinitely better product than the previous one called RecFind 5.

One of the main factors in them being better products than their predecessors is that each time we started afresh with the latest technology; we didn’t build on the old product, we discarded it completely and started anew. As a general rule of thumb I believe that software developers need to do this on around a five-year cycle. Going past the five-year life cycle inevitably means you end up compromising the design because of the need to support old technology. You are carrying ‘baggage’ and it is synonymous with trying to run the marathon with a hundred-pound (45 kg) backpack.

I recently re-read an old 1995 white paper I wrote on the future of information management software which I titled “Document Management, Records Management, Image Management Workflow Management...What? – The I.D.E.A”. I realised after reading this old paper that it is only now that I am getting close to achieving my lofty ambitions as espoused in the early paper. It is only now that I have access to the technology required to achieve my design ambitions. In fact I now believe that despite its 1995 heritage this is a paper every aspiring information management solution creator should reference because we are all still trying to achieve the ideal ‘It Does Everything Application’ (but remember that it was my I.D.E.A. first).

Of course, if you are involved in software development then you realise that your job is never done. There are always new features to add, there are always new releases of products like Windows and SQL Server to test and certify against, and there are always new releases of development tools like Visual Studio and HTML5 to learn and start using.

You also realise that software development is probably the dumbest business in the world to be part of with the exception of drug development, the only other business I can think of which has a longer timeframe between beginning R&D and earning a dollar. We typically spend millions of dollars and two to three years to bring a brand new product to market. Luckily, we still have the existing product to sell and fund the R&D. Start-ups however, don’t have this option and must rely on mortgaging the house or generous friends and relatives or venture capital companies to fund the initial development cycle.

Whatever the source of funding, from my experience it takes a brave man or woman to enter into a process where the first few years are all cost and no revenue. You have to believe in your vision, your dream and you have to be prepared for hard times and compromises and failed partnerships. Software development is not for the faint hearted.

When I wrote that white paper on the I.D.E.A. (the It Does Every Thing Application or, my ‘idea’ or vision at that time) I really thought that I was going to build it in the next few years, I didn’t think it would take another fifteen years. Of course, I am now working on the next release of RecFind so it is actually more than fifteen years.

Happily, I now market RecFind 6 as an information management solution because information management is definitely back in vogue. Hopefully, everyone understands what it means. If they don’t, I guess that I will just have to write more white papers and Blogs.

Have we really thought about disaster recovery?

by Frank 29. July 2012 06:00

The greatest knowledge-loss disaster I can think of was the destruction of the great library of Alexandria by fire around 642 AD. This was the world’s largest and most complete store of knowledge at the time and it was almost totally destroyed. It would take over a thousand years for mankind to rediscover and regain the knowledge that went up in smoke and to this day we still don’t think we have recovered or re-discovered a lot of what was lost. It was an unmitigated disaster for mankind because nearly all of Alexandria’s records were flammable and most were irreplaceable.

By contrast, we still have far older records from ancient peoples like the Egyptians of five-thousand years ago because they carved their records in stone, a far more durable material.

How durable and protected are your vital records?

I mentioned vital records because disaster recovery is really all about protecting your vital records.  If you are a business a vital record is any record without which your business could not run. For the rest of us a vital record is irreplaceable knowledge or memories. I bet the first thing you grab when fire or flood threatens your home is the family photo album or, in this day and age, the home computer or iPad or backup drive.

In 1996 I presented a paper to the records management society titled “Using technology as a surrogate for managing and capturing vital paper based records.” The technology references are now both quaint and out-of-date but the message is still valid. You need to use the most appropriate technology and processes to protect your vital records.

Interestingly, the challenges today are far greater than they were in 1996 because of the ubiquitous ‘Cloud’.  If you are using Google Docs or Office 365 or even Apple iCloud who do you think is protecting your vital records? Have you heard the term ‘outage’? Would you leave your children with a stranger, especially a stranger who doesn’t even tell you the physical location of your children? A stranger who is liable to say, “Sorry, it appears that your children are missing but under our agreement I accept no liability.” Have you ever read the standard terms and conditions of your Cloud provider? What are your rights if your vital records just disappear? Where are your children right now?

Some challenges are surprisingly no different because we are still producing a large proportion of our vital records in paper. Apart from its major flaws of being highly flammable and subject to water damage paper is in fact an excellent medium for the long term preservation of vital records because we don’t need technology to read it; we may say paper is technology agnostic.

By contrast, all forms of electronic or optical storage are strictly technology dependent. What good is that ten-year-old DAT tape if you no longer have the Pentium computer, SCSI card, cable and Windows 95 drivers to read it? Have you moved your vital records to new technology lately?

And now to the old bugbear (a persistent problem or source of annoyance), a backup is not disaster recovery. If your IT manager tells you that you are OK because he takes backups you should smack him with your heaviest notebook, (not the iPad, the iPad is too light and definitely not with the Samsung tablet, it is too fragile).

I have written about what disaster recovery really involves and described our disaster recovery services, so I won’t repeat it here; I have just provided the link so you can read at your leisure.

Suffice to say, the objective of any disaster recovery process is to ensure that you can keep running your business or life with only a minimal disruption regardless of the type or scale of the disaster.

I am willing to bet that ninety-percent of homes and businesses are unprepared and cannot in any way guarantee that they could continue to run their business or home after a major disaster.

We don’t need to look as far back as 642 AD and the Alexandria Library fire for pertinent examples. How about the tsunami in Japan in 2011? Over 200,000 homes totally destroyed and countless business premises wiped from the face of the earth. Tsunamis, earthquakes, floods, fire and wars are all very real dangers no matter where you live.

However, it isn’t just natural disasters you need to be wary of. A recent study published by EMC Corporation offers a look at how companies in Japan and Asia Pacific deal with disaster recovery. According to the study, the top three causes of data loss and downtime are hardware failure (60%), data corruption (47%), and loss of power (44%).

The study also goes on to analyse how companies are managing backups and concludes, “For all the differences inherent to how countries in the Asia Pacific region deal with their data, there is at least one similarity with the rest of the world: Companies are faced with an increasing amount of data to move within the same backup windows. Many businesses in the region, though, still rely on tape backup systems (38%) or CD-ROMs (38%). On this front, the study found that many businesses (53%) have plans to migrate from tape to a faster medium in order to improve the efficiencies of their data backup and recovery.”

It concludes by estimating where backups are actually stored, “The predominant response is to store offsite data at another company-owned location within the same country (58%), which is followed by at a “third-party site” within the same country.”

I certainly wouldn’t be relying on tape as my only recovery medium and neither would I be relying on data and systems stored at the same site or at an employee’s house. Duplication and separation are the two key principles together with proven and regularly tested processes.

I recently spoke to an IT manager who wasn’t sure what his backup (we didn’t get to disaster recovery) processes were. That was bad enough, but when he found out, it emerged that they took a full backup once a month and incremental backups every day, and he had not tested the recovery process in years. I sincerely hope that he has somewhere to run and hide if and when his company ever suffers a disaster.
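A quick back-of-the-envelope sketch shows why that schedule should worry him: a worst-case restore has to replay the monthly full plus every daily incremental taken since, in order, and anything written after the last good backup is simply gone. The dates below are illustrative only.

```python
# Sketch: how many backup sets a restore needs under a monthly-full,
# daily-incremental schedule. Dates are illustrative.
from datetime import date

def restore_chain_length(last_full: date, failure_day: date) -> int:
    """The full backup plus one incremental for each day since it was taken."""
    return 1 + (failure_day - last_full).days

if __name__ == "__main__":
    last_full = date(2012, 7, 1)   # monthly full backup
    failure = date(2012, 7, 29)    # the day disaster strikes
    print(restore_chain_length(last_full, failure), "backup sets to replay, in order")
```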

In a nutshell, disaster recovery is all about being able to get up and running again in as short a time as possible even if your building burns to the ground. That in fact is the acid test of any disaster recovery plan. That is, ask your IT manager, “If this building burns down Thursday explain to me how we will be up and operating again on Friday morning.”

If his answer doesn’t fill you with confidence then you do not have a disaster recovery plan.

 

Business Process Management, BPM, BPO; just what does it entail?

by Frank 15. July 2012 06:00

Like me, I am sure that you have been inundated with ads, articles, white papers and proposals for something called BPM or BPO: Business Process Management, Business Process Outsourcing and Business Process Optimisation.

Do you really understand what it all means?

BPM and BPO certainly aren’t new; there have been many companies offering innovative and often cutting-edge technology solutions for many years. The pioneering days were probably the early 1980s. One early innovator I can recall (and admired) was Tower Technology, because their office was just across from our old offices in Lane Cove.

In the early days BPM was all about imaging and workflow and forms. Vendors like Tower Technology used early versions of workflow products like Staffware and a whole assortment of different imaging and forms products to solve customer processing problems. It involved a lot of inventing and a lot of creative genius to make all those disparate products work and actually do what the sales person promised. More often than not the final solution didn’t quite work as promised and it always seemed to cost a lot more than quoted.

Like all new technologies everyone had to go through a learning process and like most new technologies, for many years the promises were far ahead of what was actually delivered.

So, is it any different today? Is BPM a proven, reliable and feature-rich and mature technology?

The answer dear friends is yes and no; just as it was twenty-five or more years ago.

There is a wonderful Latin phrase ‘Caveat Emptor’ which means “Let the buyer beware”. Caveat Emptor applies just as much today as it did in the early days because despite the enormous technological progress we have all witnessed and experienced we are still pushing the envelope. We are still being asked to do things the current software and hardware can’t quite yet handle. The behind the scenes technicians are still trying to make the product do what the sales person promised in good faith (we hope) because he didn’t really understand his product set.

Caveat Emptor means it is up to the buyer to evaluate the offering and decide if it can do the job. Of course, if the vendor lies or makes blatant false claims then Caveat Emptor no longer applies and you can hit them with a lawsuit.  However, in reality it is rarely as black and white as that. The technology is complex and the proposals and explanations are full of proprietary terminology, ambiguities, acronyms and weaselly words.

Like most agreements in life you shouldn’t enter into a BPM contract unless you know exactly what you are getting into. This is especially true with BPM or BPO because you are talking about handing over part of your core business processes to someone else to ‘improve’. If you don’t understand what is being proposed then please hire someone who does; I guarantee it will be worth the investment. This is especially true if you are outsourcing customer or supplier facing processes like accounts payable and accounts receivable. Better to spend a little more up front than suffer cost overruns, failed processes and an inbox full of complaints.

My advice is to always begin with some form of a consultancy to ‘examine’ your processes and produce a report containing conclusions and recommendations. The vendor may (should) offer this as part of its sales process and it may be free or it may be chargeable.  Personally, I believe in the old adage that you get what you pay for so I would prefer to pay to have a qualified and experienced professional consultant do the study. The advantage of paying for the study is that you then ‘own’ the report and can then legally provide it to other vendors to obtain competitive quotes.

You should also have a pretty good idea of what the current processing is costing you in both direct and indirect costs (e.g., lost sales, dissatisfied customers, unhappy staff, etc.) before beginning the evaluation exercise. Otherwise, how are you going to be able to judge the added value of the vendor’s proposal?

In my experience, the most common set of processes to be ‘outsourced’ is accounts payable processing. This is the automation of all processes beginning with your purchase order (and its line items), the delivery docket (proof of receipt), invoices (and line items) and statements. The automation should reconcile invoices to delivery dockets and purchase orders and should flag any discrepancies such as items invoiced but not delivered, variations in price, etc. Vendors will usually propose what is commonly called an automatic matching engine: the software that reads all the documents and does its best to make sure you only pay for delivered goods that are exactly as ordered.
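To show what a matching engine actually does, here is a toy three-way match (the data structures and sample lines are invented for illustration): it reconciles each invoice line against the purchase order and the delivery docket and flags anything invoiced but not ordered, not delivered in full, or priced differently.

```python
# Toy three-way match: invoice lines vs purchase order vs delivery docket.
from dataclasses import dataclass

@dataclass(frozen=True)
class Line:
    item: str
    quantity: int
    unit_price: float

def three_way_match(po: dict[str, Line], docket: dict[str, int], invoice: list[Line]) -> list[str]:
    issues = []
    for inv in invoice:
        ordered = po.get(inv.item)
        delivered = docket.get(inv.item, 0)
        if ordered is None:
            issues.append(f"{inv.item}: invoiced but never ordered")
            continue
        if inv.unit_price != ordered.unit_price:
            issues.append(f"{inv.item}: price {inv.unit_price} differs from PO price {ordered.unit_price}")
        if inv.quantity > delivered:
            issues.append(f"{inv.item}: invoiced {inv.quantity}, only {delivered} delivered")
    return issues

if __name__ == "__main__":
    po = {"toner": Line("toner", 10, 49.50), "paper": Line("paper", 100, 4.20)}
    docket = {"toner": 10, "paper": 80}
    invoice = [Line("toner", 10, 52.00), Line("paper", 100, 4.20), Line("stapler", 1, 12.00)]
    for issue in three_way_match(po, docket, invoice):
        print(issue)
```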

If the vendor’s proposal is to be attractive it must replace your manual processing with an automated model that is faster and more accurate. Ideally, it would also be more cost-effective but even if it is more costly than your manual direct cost estimate it should still solve most of your indirect cost problems like unhappy suppliers and late payment fees.

In essence, there is nothing magical about BPM and BPO; it is all about replacing inefficient manual processes with much more efficient automated ones using clever computer software. The magic, if that is the word to use, is about getting it right. You need to know what the current manual processing is costing you. You need to be absolutely sure that you fully understand the vendor’s proposal and you need to build in metrics so you can accurately evaluate the finished product and clearly determine if it is meeting its stated objectives.

Please don’t enter into negotiations thinking that if it doesn’t work you can just blame the vendor. That would be akin to cutting off your nose to spite your face. Remember Caveat Emptor; success or failure really depends upon how well you do your job as the customer.
