The secret to increased productivity

by Frank 4. August 2015 06:41

By nature I am and have always been a sequencer and an overlapper. It comes naturally to me. It is how I process everyday events. For example, in the morning when making a pot of tea I first fill up the jug and turn it on to boil before emptying and cleaning the teapot. This is because I want the two tasks to overlap for maximum efficiency. If I emptied and cleaned the teapot first before filling up the jug and turning it on, the elapsed time required to make a pot of tea would be longer and therefore inefficient. With my method I save time because the total elapsed time to make a pot of tea is how long it takes to fill the jug and get it to boil. I correctly sequence and overlap the two events to be more productive. It also helps me to get to work on time.

Here is another simple example. Have you ever sat in a restaurant and watched with frustration as the waiter brought out meals and then returned to the kitchen without picking up your dirty dishes? Then watched in frustration again as the waiter came out to pick up dirty dishes but left someone’s lunch at the kitchen counter getting cold? Why doesn’t the waiter pick up dirty dishes, or take your order, on the way back to the kitchen? Life, business and government are full of such everyday examples of non-overlapping, poorly sequenced processes, all resulting in lower productivity and higher costs for everyone.

The worst example of all is when employees are allowed to tightly redefine their jobs, concentrating more on “this is what I don’t do” instead of “this is what I do”. For these employees, the terms ‘multi-skilling’ and ‘multi-tasking’ are anathema. I envision them standing within a tiny, tight circle where anything outside of that circle is not their responsibility. We may as well brick them up inside a chimney. These are not the kind of employees or practices I want in my business, or in our public service or our government for that matter. Unfortunately, these are exactly the kind of anti-productivity practices we find throughout our public sector and our government. As most of us are already more than well aware, the problem is more than endemic; it is systemic and probably not fixable short of a revolution. It is no secret why our taxes are so high and getting higher all the time.

Many years ago when I was a trainee programmer I learnt all about overlap while being trained at IBM. The patient instructor made the point that computers only seem to do multiple things at the same time. In fact, the architecture of computer processing at that time meant a computer could only process one command at a time, but by making use of overlap and time-sharing it appeared as if it was doing many things at once. For example, the IBM 360 processor would issue an I/O command to a channel to go off and read a record from a disk drive. Relatively speaking, this took an enormous amount of time because disks were so slow compared to the CPU. So instead of waiting for the channel to complete the I/O request the processor would process other work while waiting for the channel to interrupt it and say “I am finished, here is the data you asked for”. So the computer appeared to be doing multiple tasks at once because it correctly sequenced the tasks it had to perform and took full advantage of overlap. Therein lies a lesson for all of us.

When faced with a list of tasks to perform, first think about the opportunities for overlap. Then sequence the tasks to take maximum advantage of that overlap.

All it requires is the desire to work smarter, a little thought and a sense of pleasure in making best use of the limited time life allows us all.

In my role as a designer of computer software I always try to take advantage of sequencing and overlap. In my business, the two terms most used when implementing this approach are asynchronous events and multi-threading. These two techniques should always be applied when a list of tasks to be performed is not sequential. That is, they don’t have to be completed one after the other in a strict sequence. We take advantage of the fact that some tasks are independent and therefore can be processed at the same time we process other tasks. We do this in various ways but usually by defining them as asynchronous events and by utilizing a form of multi-tasking or multi-threading (starting two or more events at the same time). Computers aren’t smart (at least not yet) and they rely totally on human programmers to make them behave in an efficient and ‘smart’ way. Computer programmers who don’t understand sequencing and overlap can write very bad and very slow programs even to the extent of making very fast computers look very slow. Then, they waste everyone’s time and become major contributors to the anti-productivity movement.
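To make the idea concrete, here is a minimal sketch in Python (the function names and timings are invented purely for illustration; this is not how any particular product is built) contrasting a strictly sequential program with one that overlaps two independent I/O-bound tasks.

```python
import asyncio

# Hypothetical stand-ins for two independent, slow I/O-bound tasks.
async def fetch_customer_record(customer_id: str) -> str:
    await asyncio.sleep(2)          # simulates a 2-second database read
    return f"record for {customer_id}"

async def fetch_outstanding_orders(customer_id: str) -> str:
    await asyncio.sleep(3)          # simulates a 3-second web-service call
    return f"orders for {customer_id}"

async def sequential(customer_id: str) -> None:
    # Poorly sequenced: total elapsed time is 2 + 3 = 5 seconds.
    record = await fetch_customer_record(customer_id)
    orders = await fetch_outstanding_orders(customer_id)
    print(record, orders)

async def overlapped(customer_id: str) -> None:
    # Overlapped: both requests run at once, so elapsed time is
    # only as long as the slowest task (about 3 seconds).
    record, orders = await asyncio.gather(
        fetch_customer_record(customer_id),
        fetch_outstanding_orders(customer_id),
    )
    print(record, orders)

if __name__ == "__main__":
    asyncio.run(overlapped("C-1001"))
```

Run the sequential and overlapped versions side by side and the difference in elapsed time is exactly the kettle-and-teapot saving described above.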

There is an enormous amount of money being invested today in the science of longevity; in trying to find ways to make it possible for people to live longer lives. When the solution becomes available it won’t be cheap and it won’t be available to ordinary people like you and me. It will initially only be available to the elite and to the very rich. However, don’t despair; there is a low-cost way to double the amount of time you have to enjoy life. An easy way, available right now, to double your life span.

All you have to do is be aware of the possibilities of sequencing and overlap in your life and then work to take advantage of them. If you reduce the amount of time you take to do ‘work’ every day by fifty, forty, thirty or even twenty percent you are adding years to the time you have to live and enjoy life. It is the easiest and lowest cost way to increase your effective life span.

For example, don’t try to impress your boss by working longer hours; arriving first and leaving last (as my generation did). Instead, impress your boss with a proposal whereby you do more work in fewer hours. You of course need to quantify your proposal and add in some metrics so your increased productivity can be measured and proven.

Please don’t waste your time and your effective life span by pondering ways to avoid work; instead, utilize those same cognitive processes to work out how to complete your assigned work in the fastest way possible. Approach every project looking for ways to better sequence tasks and take advantage of overlap. Make it a game; enjoy it.

I was once told that the average pattern of a human life is eight-hours work, eight-hours sleep and eight-hours play. Of course, with commuting, it is really now more like eight-hours sleep, ten-hours work and six-hours play. Let’s try and double those play hours.

As I am fond of saying, it isn’t rocket science. It is just common sense, a very simple and achievable way to significantly increase your effective life span; the time available to you to enjoy life. Give yourself twice as much time to enjoy life and in doing so, live twice as long. 

Increased productivity doesn’t just provide benefits to the economy; it can also provide very substantial personal benefits. Why don’t you give it a try?

The absolute easiest & lowest cost way to meet all Electronic Document & Records Management (EDRMS) requirements?

by Frank 19. May 2015 06:00

Because we are a software vendor that builds and markets a range of Enterprise Content Management tools under the RecFind 6 banner I have often been asked, “What is the absolute easiest and lowest cost way to meet all compliance requirements?”

I usually respond with a well-considered and ‘traditional’ response that includes information about Business Classification Systems, UI design, Retention Schedules, etc., etc. The solution proposed would also require a significant degree of consulting, so that we are entirely conversant with the customer’s requirements and business practices, and would also involve a significant amount of end user training.

This is what the customer expects and it falls in line with the traditional, professional approach.

However, the final solution is rarely ‘easy’ or ‘low-cost’ primarily because it has followed the traditional approach. The more we ask questions and consult and the more people we speak to the more complex the solution becomes. This is normal because we end up trying to configure a solution to meet hundreds or thousands of variables.

There is an easier and lower cost way but I fear that very few of my customers would ever consider it because it requires them to disregard everything they have ever learned about rolling out an EDRMS. We have tried proposing it a few times but never with success. It usually gets shot down by the external consultant or the internal records management professional or both.

It doesn’t require a BCS or a Taxonomy and it doesn’t require a complex Retention Schedule and it doesn’t require significant consulting or significant end-user training. Records Management professionals will surely hate it as a ‘career-ending’ trend. It does require an open mind, the ability to think laterally and a willingness to redefine both the problem and the solution.

It only has three requirements:

  1. Know what electronic documents and emails you don’t want to capture (a short sketch of this rule follows the list);
  2. Provide a powerful but easy-to-use search that allows anyone to find anything instantly; and
  3. Employ a risk-management approach to retention and select a single retention date (e.g., 7 or 20 years).
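As an illustration only, here is a minimal Python sketch of what the first and third requirements might look like in practice. The file extensions, folder names and seven-year period are hypothetical placeholders, not a recommendation for your organization.

```python
from datetime import datetime, timedelta
from pathlib import Path

# Hypothetical exclusion rules: the point is to define what you do NOT
# want to capture and let everything else through by default.
EXCLUDED_EXTENSIONS = {".tmp", ".log", ".bak"}
EXCLUDED_FOLDERS = {"personal", "drafts", "scratch"}

# A single, risk-managed retention period applied to everything captured.
RETENTION_YEARS = 7

def should_capture(path: Path) -> bool:
    """Capture everything except the documents we have ruled out."""
    if path.suffix.lower() in EXCLUDED_EXTENSIONS:
        return False
    if any(part.lower() in EXCLUDED_FOLDERS for part in path.parts):
        return False
    return True

def disposal_date(date_captured: datetime) -> datetime:
    """One flat retention date instead of a complex retention schedule."""
    return date_captured + timedelta(days=365 * RETENTION_YEARS)

# Example: a document captured today may be disposed of in roughly 7 years.
doc = Path(r"\\fileserver\shared\contracts\acme-2015.docx")
if should_capture(doc):
    print("capture; dispose after", disposal_date(datetime.now()).date())
```

The point of the single retention period is that one defensible rule applied consistently beats a complex schedule applied inconsistently.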

Fundamental to the success of this non-conformist solution is the acceptance that computers and storage are dirt-cheap compared to human time. If your IT manager or CIO still agonizes and complains about how much disk space you use up for emails and electronic documents then this is definitely not the solution for you. Your IT hierarchy is still living in the long-gone past when computers and disk were expensive and people were cheap (by comparison).

However, if you have practical, sensible IT people then the approach is worth considering especially if your organization has a long history of failing to digitize its records and automate its processes. That is, you have tried at least once to roll out an organization-wide EDRMS and have failed and/or blown the budget. The word ‘pilot’ probably appears often in your company history usually prefixed by the adjective ‘failed’. Don’t feel too bad, most pilots are initiated because management lacks conviction. They are therefore destined to fail.

We have the tools required to implement such a solution but I won’t go into detail about them now. This is a concept paper, not a detailed instruction manual. If you are interested in the concept please contact me and I can then elaborate.

So, if you really do want to roll out a successful EDRMS and do it in the fastest, least disruptive and lowest cost way possible then please write to me and pose your questions.

For the doubters, this is the same way we manage our electronic documents and emails at Knowledgeone Corporation and we have done so for many years. We use our own software; apart from a couple of accounting packages we run our whole company with the RecFind 6 Product Suite and totally automate the capture of all electronic documents and emails. All my staff have to know is how to search and yes, they can find anything in seconds even after 31 years of operation and a very, very large database.

It is not difficult, it is not ‘expensive’, it does not require a huge amount of management or maintenance time and it runs largely in the background. As I said above, all your staff have to learn is how to search.

It does, however, require an open mind and a desire to finally solve the problem in the most expeditious manner possible. But, please don’t tell me you want to run a pilot. Test my solution by all means and put it through the most rigorous change control procedures but don’t damn the end result by beginning with a “we are not really sure it will work so are not really committed and won’t allocate much of a budget but let’s try a pilot anyway because that limits our exposure and risk” approach.

I don’t want to waste your time or mine.

How to simplify electronic document and email management

by Frank 17. September 2014 06:00

I have written about this topic many times in the past (see links at the end of this post) but the lesson is always the same. There are two key rules:

1.     If your system relies on people being 100% consistent and reliable it won’t work; and

2.     If your system places an additional workload on already busy workers it won’t work.

The message is, if you simplify and automate your system you give it the best possible chance of working.

If your system works as automatically as possible and doesn’t require much effort from your workforce then it has the best possible chance of being successful.

With today’s technology and tools there is simply no need to burden your workforce with capture and classification tasks. Do you still see people using typewriters, rotary phones or Morse code? No you don’t, because there is much better technology available. So why do you persist with an old, outdated and unsuccessful model? Why do you ask your staff to manually capture and classify electronic documents and emails when there are much better, much faster, much more consistent and much more reliable ways to achieve a better result? It is after all 2014, not 1914; we all use computers and smart phones now, not typewriters, wind-up rotary phones and Morse code.

Emails are managed by email servers (yes, even Google’s). Email servers allow plug-ins and add-ons and are ‘open’ so you can automatically monitor and capture incoming and outgoing emails.

Electronic documents are always saved somewhere, for example on your shared drives or directly into your DMS. As such they can be captured and interrogated programmatically.

It is entirely possible to ‘parse’ any electronic document or email and its associated attributes and Metadata and make consistent decisions about whether or not to capture it and how to classify it when captured. It isn’t rocket science any more, it is just analysis, design and programming. We can go even further and determine who should be notified and what action(s) need to be initiated in response to each new email or electronic document.  
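By way of illustration, here is a minimal Python sketch of that kind of rule-driven parsing and classification. The rules, addresses and file names are invented; a production system would hold its rules in configuration and consider far more attributes than these.

```python
import email
from email import policy
from pathlib import Path

# Hypothetical, hard-coded classification rules keyed on simple metadata;
# in practice these rules would live in configuration, not code.
CLASSIFICATION_RULES = [
    (lambda meta: "invoice" in meta["subject"].lower(), "Accounts/Invoices"),
    (lambda meta: meta["sender"].endswith("@lawfirm.example.com"), "Legal/Correspondence"),
]
DEFAULT_CLASS = "General/Unclassified"

def parse_email(path: Path) -> dict:
    """Pull out the small, common set of attributes every email carries."""
    msg = email.message_from_bytes(path.read_bytes(), policy=policy.default)
    return {
        "sender": msg.get("From", ""),
        "recipient": msg.get("To", ""),
        "subject": msg.get("Subject", ""),
        "date": msg.get("Date", ""),
    }

def classify(meta: dict) -> str:
    """Apply the same rules every time, to every message: no human judgement."""
    for rule, category in CLASSIFICATION_RULES:
        if rule(meta):
            return category
    return DEFAULT_CLASS

# Hypothetical input file.
meta = parse_email(Path("incoming/message-0001.eml"))
print(classify(meta), "-", meta["subject"])
```

The same pattern extends naturally to deciding who should be notified and which workflow to start for each captured item.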

We can easily implement an end-to-end business process whereby every electronic document and email is managed from creation to destruction and we can do this with minimal human involvement. Where human involvement is required, for example making a decision or deciding upon an appropriate response, we can also automate and manage the business processes involved and simply ‘present’ staff with all the required information at the time it is needed.

Isn’t this what the Knowledge Management revolution was supposed to be about?

“A system that provides the user with the explicit information required, in exactly the form required at precisely the time the user needs it.”

The new model is all about automation and processing at the server rather than at the user’s workstation; a fully automatic, server-centric paradigm. A system that is all about the ‘Push’ rather than the ‘Pull’ model. A model whereby the computer services the end user, where the end user is not a slave to the computer.

We could also call it management by exception. “Please only give me what I need to see when I need to see it.”

None of the above is new or revolutionary thinking, it is all just common sense. None of the above requires yet-to-be invented technology or products, it only requires existing and proven technology and products.

The fully-automatic, server-centric approach should be the default choice and it should be a no-brainer for any organization that needs to implement an email and document management regime. Unfortunately, too often it isn’t.

If you have the responsibility of rolling out an email and document management system and the fully-automatic, server-centric approach isn’t on your agenda then your boss should be asking you why not.

References:

White papers

Posts

How to clean up your shared drives, Frank’s approach

by Frank 22. August 2014 06:00

In my time in this business (enterprise content management, records management, document management, etc.) I have been asked to help with a ‘shared drive problem’ more times than I can remember. This particular issue is analogous to the paperless office problem. Thirty years ago when I started my company I naively thought that both problems would be long gone by now but they are not.

I still get requests for purely physical records management solutions and I still get requests to assist customers in sorting out their shared drives problems.

The tools and procedures to solve both problems have been around for a long time but for whatever reason (I suspect lack of management focus) the problems still persist and could be described as systemic across most industry segments.

Yes, I know that you can implement an electronic document and records management system (we have one called RecFind 6) and take away the need for shared drives and physical records management systems completely but most organizations don’t and most organizations still struggle with shared drives and physical records. This post addresses the reality.

Unfortunately, the most important ingredient in any solution is ‘ownership’ and that is as hard to find as it ever was. Someone with authority, or someone who is prepared to assume authority, needs to take ownership of the problem in a benevolent-dictator way and just steam-roll a solution through the enterprise. It isn’t solvable by committees and it requires a committed, driven person to make it happen. These kinds of people are in short supply, so if you don’t have one, bring one in.

In a nutshell, apart from ownership, there are three basic problems.

1.     How to delete all redundant information;

2.     How to structure the ‘new’ shared drives; and

3.     How to make the new system work to most people’s satisfaction.

Deleting redundant information

Rule number one is don’t ever ask staff to delete the information they regard as redundant. It will never happen. Instead, tell staff that you will delete all documents in your shared drives that were created or last updated before a nominated cut-off date (say, one year in the past) unless they tell you specifically which ‘older’ documents they need to retain. Just saying “all of them” is not an acceptable response. Give staff a month’s advance notice and then delete everything that has not been nominated as important enough to retain. Of course, take a backup of everything before you delete, just in case. This is tough love, not stupidity.
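For illustration, a minimal Python sketch of that first pass follows. The share path and the ‘nominated to keep’ file are hypothetical; the point is that the cut-off rule, not individual staff judgement, decides what goes on the deletion list.

```python
from datetime import datetime, timedelta
from pathlib import Path

SHARE = Path(r"\\fileserver\shared")          # hypothetical shared drive root
CUTOFF = datetime.now() - timedelta(days=365)  # the nominated cut-off date

# Paths staff explicitly nominated to keep during the one-month notice
# period (hypothetical file, one full path per line).
with open("nominated_to_keep.txt", encoding="utf-8") as fh:
    KEEP_LIST = {line.strip() for line in fh if line.strip()}

candidates = []
for f in SHARE.rglob("*"):
    if not f.is_file() or str(f) in KEEP_LIST:
        continue
    last_touched = datetime.fromtimestamp(f.stat().st_mtime)
    if last_touched < CUTOFF:                  # untouched since the cut-off
        candidates.append(f)

# Report first, back everything up, and only then delete:
# tough love, not stupidity.
print(f"{len(candidates)} files are candidates for deletion")
```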

Structuring the new shared drives

If your records manager insists on using your already overly complex, hierarchical corporate classification scheme or taxonomy as the model for the new shared drive structure politely ask them to look for another job. Do you want this to work or not?

Records managers and archivists and librarians (and scientists) understand and love complex classification systems. However, end users don’t understand them, don’t like them and won’t use them. End users have no wish to become part-time records managers, they have their own work to do thank you.

By all means make the new structure a subset of the classification system, major headings only and no more than two levels if possible. If it takes longer than a few seconds to decide where to save something or to find something then it is too complex. If three people save the same document in three different places then it is too complex. If a senior manager can’t find something instantly then it is too complex. The staff aren’t to blame, you are.

I have written about this issue previously and you can reference a white paper at this link, “Do you really need a Taxonomy?”

The shared drives aren’t where we classify documents; they are where we make it as easy and as fast as possible to save, retrieve and work on documents; no more, no less. Proper classification (if I can use that term) happens later when you use intelligent software to automatically capture, analyse and store documents in your document management system.

Please note, shared drives are not a document management system and a document management system should never just be a copy of your shared drives. They have different jobs to do.

Making the new system work

Let’s fall back on one of the oldest acronyms in business, KISS, “Keep It Simple Stupid!” Simple is good and elegant, complex is bad and unfathomable.

Testing is a good example of where the KISS principle must be applied. Asking all staff to participate in the testing process may be diplomatic but it is also suicidal. You need to select your testers. You need to pick a small number of smart people from all levels of your organization. Don’t ask for volunteers, you will get the wrong people applying. Do you want participants who are committed to the system working, or those who are committed to it failing? Do you want this to succeed or not?

If I am pressed for time I use what I call the straight-line-method. Imagine all staff in a straight line from the most junior to the most senior. Select from both ends, the most junior and the most senior. Chances are that if the system works for this subset that it will also work for all the staff in between.

Make it clear to all that the shared drives are not your document management system. The shared drives are there for ease of access and to work on documents. The document management system has business rules to ensure that you have inviolate copies of important documents plus all relevant contextual information. The document management system is where you apply business rules and workflow. The document management system is all about business process management and compliance. The shared drives and the document management system are related and integrated but they have different jobs to do.

We have shared drives so staff don’t work on documents on ‘private’ drives, inaccessible and invisible to others. We provide a shared drive resource so staff can collaborate and share information and easily work on documents. We have shared drives so that when someone leaves we still have all their documents and work-in-process.

Please do all the complex processes required in your document management system using intelligent software, automate as much as possible. Productivity gains come about when you take work off staff, not when you load them up with more work. Give your staff as much time as possible so they can use their expertise to do the core job they were hired for.

If you don’t force extra work on your staff and if you make it as easy and as fast as possible to use the shared drives then your system will work. Do the opposite and I guarantee it will not work.

A simple guide to using shared drives to capture & classify electronic documents and emails

by Frank 18. July 2014 06:00

I have written previously about ways to solve the shared drives problem (click here) and I have written numerous articles (and a book) about ways to manage emails and electronic/digital records. However, we still receive multiple requests from customers and prospective customers about the best, and simplest, way to effectively manage these problems.

The biggest stumbling block and impediment to progress in most cases is the issue of a suitable taxonomy or classification system. Time and time again I see people putting off the solution while they spend years and tens of thousands or hundreds of thousands of dollars grappling with the construction of a suitable taxonomy. I have written about this topic previously as well and if you want my recommendations please click on this link.

If you really want the simplest, easiest to understand, easiest to use and lowest cost way to solve all of the above problems then please forget about spending the next twelve to eighteen months grappling with the nuances of your classification system. It isn’t necessary.

What you need instead is a natural classification structure that reflects your business processes. Please give your long-suffering end users something they will instantly recognize and can easily work with because it is familiar from their day to day work. Give them something to work with that doesn’t require them to become amateur records managers battling to decipher a complex, hierarchical classification system that requires an intricate knowledge of classification theory to interpret correctly. Give them something that makes it as easy as possible to file everything in the right place first time with absolutely minimal effort. Give them something that makes it as easy as possible to find something.

What I am proposing isn’t a hundred-percent solution and it won’t suit every organization but I guarantee that it will turn chaos into order in any organization that implements it. You may well see it as an eighty-five-percent solution but that is a hell of a lot better than no solution. It is also easy and fast to implement and relatively low cost (you will need some form of RM software).

First up you need to make decisions about what kind of business you are.  Notice that I said “what kind of business you are” not “what kind of records you manage” or “how your business is structured”.  Most importantly, strongly resist the temptation to base your classification structure on your existing business structure or organization’s departments/agencies and instead base it on your most common business processes. Please refer to the following extract from:

Overview of Classification Tools for Records Management by the National Archives of Australia, ISBN 0 642 34499 X (an excellent reference document if you need to understand classification systems).

“Classifying records and business information by functions and activities moves away from traditional classification based on organisational structure or subject. Functions and activities provide a more stable framework for classification than organisational structures that are often subject to change through amalgamation, devolution and decentralisation. The structure of an organisation may change many times, but the functions an organisation carries out usually remain much the same over time.”

I would also strongly resist the temptation to build your classification structure on content; it is way too difficult. Instead, as I have said above, base it on your common business processes.

When I say classification structure I mean the way you name and organize folders in your shared drives. I can’t give you a generic solution because I am not that clever; I don’t know enough about your business. I can however, give you an example.

Please also remember that for the most part, we are dealing with unstructured source information; Word, Excel, PowerPoint, Emails, etc. Emails are a little easier to deal with because they have a limited but common structure, e.g., Date Received, Sender, Recipient, CC and Subject. With other electronic documents we have far less information and are usually limited to Author (not reliable), Date Created, Date Modified and Filename. Ergo, as I said earlier, trying to base a classification system on the content of unstructured documents is both difficult and inexact. It is certainly doable but you will have to spend a lot more money on consulting and sophisticated software to achieve your ends.

In my simple example of my simple system I am going to assume that your business is customer (or client) centric, i.e., as opposed to being case-centric or project-centric, etc. The top level of your classification structure therefore will be the client name and/or number. To make it as simple as possible I am going to propose only two levels. The second level represents your most common business processes, that is, what you do with each customer. So for example, I have:

Customer Name

     Correspondence

     Contracts

     Quotes & Proposals

     Orders

     Incidents

I am also not going to differentiate between emails and other types of electronic documents, I am going to treat them all the same.
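To show just how simple the structure really is, here is a small Python sketch that creates the same two-level tree for every new customer. The share path and folder names simply reuse the example above; they are not a prescription for your business.

```python
from pathlib import Path

SHARE = Path(r"\\fileserver\shared")    # hypothetical shared drive root

# Second level: the organization's most common business processes.
PROCESS_FOLDERS = ["Correspondence", "Contracts", "Quotes & Proposals",
                   "Orders", "Incidents"]

def create_customer_folders(customer_name: str) -> None:
    """Create the identical two-level structure for every new customer."""
    for process in PROCESS_FOLDERS:
        (SHARE / customer_name / process).mkdir(parents=True, exist_ok=True)

create_customer_folders("Acme Pty Ltd")
```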

Now how does this simple system work?

  1. Staff producing electronic documents don’t have their ‘own’ shared drive; all staff use the common classification structure. This is very important: let one or more people be exceptions and you no longer have a system you can rely on to meet your needs for reliable retrieval or any compliance legislation you are subject to.
  2. Staff drag and drop or ‘save-as’ emails from their email client to the correct sub-folder.
  3. Similarly, staff save (or drag and drop) electronic documents into the correct sub-folder. You can control access if required by applying security to electronic documents.
  4. You purchase or build a document repository (based on any common database such as SQL Server, MySQL, etc.) and within this repository you replicate the folder structure of your shared drives with logical folders and subfolders.
  5. You purchase or build a tool that constantly monitors the shared drives (e.g., using .NET file-watcher technology; see the sketch after this list) and that instantly captures a copy of any new or modified document (you do need to configure your repository to automatically version modified documents). You may also decide to automatically delete the original source document after it has been captured.
  6. You build or purchase a records and document management software package that allows you to index, search and report on all the information in your repository.
  7. You train your staff in how to save and search for information (this shouldn’t take more than half a day to a day) and then you go live.
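As a rough illustration of step 5, here is a Python sketch using the third-party watchdog package (a Python analogue of the .NET file-watcher approach mentioned above; it is not our capture tool). The share and repository paths are hypothetical, and a real repository would add versioning, metadata capture and error handling.

```python
import shutil
import time
from pathlib import Path

# Third-party package: pip install watchdog
from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

SHARE = Path(r"\\fileserver\shared")          # hypothetical shared drive root
REPOSITORY = Path(r"D:\repository")           # hypothetical repository root

class CaptureHandler(FileSystemEventHandler):
    def on_created(self, event):
        self._capture(event)

    def on_modified(self, event):
        self._capture(event)   # a real repository would version this copy

    def _capture(self, event):
        if event.is_directory:
            return
        source = Path(event.src_path)
        # Mirror the shared-drive folder structure inside the repository.
        target = REPOSITORY / source.relative_to(SHARE)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source, target)

observer = Observer()
observer.schedule(CaptureHandler(), str(SHARE), recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)                          # keep watching until stopped
except KeyboardInterrupt:
    observer.stop()
observer.join()
```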

I would also recommend applying a retention schedule based on sub folder (e.g., contracts) and date created and have the records management system automatically apply it to manage the lifecycle of captured documents. There is no sense in retaining information longer than you have to; it is also a dangerous practice.
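A minimal sketch of that idea, with purely hypothetical retention periods keyed on the second-level business-process folder:

```python
from datetime import datetime, timedelta

# Hypothetical retention periods, in years, per business-process folder.
RETENTION_BY_FOLDER = {
    "Correspondence": 3,
    "Contracts": 7,
    "Quotes & Proposals": 2,
    "Orders": 7,
    "Incidents": 10,
}

def destruction_date(folder: str, date_created: datetime) -> datetime:
    """Derive the disposal date from the sub-folder and the creation date."""
    years = RETENTION_BY_FOLDER.get(folder, 7)   # conservative default
    return date_created + timedelta(days=365 * years)

print(destruction_date("Contracts", datetime(2014, 7, 18)).date())
```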

Please note that the above is just an example and a very simple one at that. You need to determine the most appropriate folder structure for your organization.

WARNING

Do not let the folder structure become overly complex and unwieldy. If you do, it won’t work and you will end up with lots of stuff either not captured or captured to the wrong place. The basic rules are that if it takes more than a few seconds to decide where to file something then it is too complex, and that any structure more than three levels deep is too complex.

And finally, this isn’t just a theory, it is something we do in our organization and it is something many of our customers do. If you would like to read more on this approach there are some white papers and more explanations at this link. Alternatively, you can contact us and ask questions at this link.

Good luck.

 

Are you still struggling with physical records management, with paper?

by Frank 16. July 2014 00:01

 


We produced our first computerised records management system in 1984 (when our company was called GMB) and it was called DocFind. It was initially marketed by the Burroughs Corporation to about 100 clients; we then started marketing DocFind direct and sold it to about another 2,000 clients.

Every one of those clients wanted DocFind just to manage physical records: paper, file folders and archive boxes. There was little or no demand for document imaging and workflow and the term electronic document management had yet to be invented. Office automation was in its infancy. We, for example, wrote our letters on an Apple IIe using a word processor called WordStar running under CP/M.

In 1986 we released RecFind, a major remake of the DocFind product. This product was initially marketed by ourselves and NEC and it too focussed just on managing physical records.

However, even in 1986 we knew we had a bigger job to do with the general acceptance of document scanners and workflow, so we added imaging and workflow to our product and started trying to convince our customers and prospective customers to reduce the size of their paper mountain and even to start planning for a ‘Paperless Office’.

In the late 1980s and early 1990s I delivered numerous papers extolling the value of the paperless office and worked hard to convince my customers to make the move to Electronic Document and Records Management (EDRMS).

In the mid-1990s the industry discovered ‘Knowledge Management’ (KMS) and industry consultants lost interest in EDRMS and instead heavily promoted the virtues and benefits of KMS, whatever it was. Maybe this was the time organizations lost interest in eradicating paper as senior IT staff and consultants moved on to more interesting projects like KMS.

In 1995 I delivered my first paper on a totally integrated information management system or what I called at the time the ‘It Does Everything Application’ (IDEA). In 1995 I truly thought the age of physical records management was almost over and that the western world at least would move to fully-automated, paperless processes.

How wrong I was 19 years ago.

Today, despite the advanced functionality of our RecFind 6 Product Suite, almost all of my customers still manage physical records with RecFind 6. At least half of the inquiries that come in via our website are for systems to manage physical records.

There is more paper in the world today than there has ever been and organizations all over the world still struggle with managing paper, vast amounts of paper.

Luckily for us, we never succumbed to the temptation to remove the paper handling features from our products. Instead, we added to them with each subsequent release and redesign/rewrite of RecFind. We had to provide upwards compatibility for our clients as they still managed mountains of paper both onsite and offsite.

Being a little older and wiser now I am never again going to predict the paperless office. I will provide advanced physical records management functionality for my clients as long as they require it.

I haven’t given up the fight but my job is to address the real needs of my customers and they tell me, and keep telling me, that they need to manage paper, mainly file folders full of paper and archive boxes full of file folders. They need to manage paper onsite in shelving and offsite in warehouses with millions of boxes, and we do it all.

We manage paper from creation to destruction and throughout the whole lifecycle. We apply retention schedules and classification systems and we track anything and everything with barcodes and barcode readers. We have enhanced our products to cater for every need and we are now probably responsible for millions of tonnes of paper all over the world.

I still hope for a paperless world but I very much doubt that I am going to see it in my lifetime.

So, if you are still struggling with how to best manage all your physical records please don’t despair, you are most certainly not alone! 

  

What is the future of RecFind? - The Product Road Map

by Frank 19. May 2014 06:00

First a little history. We began in 1984 with our first document management application called DocFind, marketed by the then Burroughs Corporation (now called Unisys). In June 1986 we sold the first version of RecFind, a fully-featured electronic records management system and a vast improvement on the DocFind product. Then we progressively added document imaging, then electronic document management and workflow, and then, with RecFind 6, a brand new paradigm and an amalgam of all previous functionality: an information management system able to run multiple applications concurrently with a complete set of enterprise content management functionality. RecFind 6 is the eighth completely new iteration of the iconic RecFind brand.

RecFind 6 was and is unique in our industry because it was designed to be what was previously called a Rapid Application Development system (RAD) but unlike previous examples, we provided the high level toolset so new applications could be inexpensively ‘configured’ (by using the DRM) not expensively programmed and new application tables and fields easily populated using Xchange. It immediately provided every customer with the ability to change almost anything they needed changed without needing to deal with the vendor (us).  Each customer had the same tools we used to configure multiple applications within a single copy of RecFind 6. RecFind 6 was the first ECM product to truly empower the customer and to release them from the expensive and time consuming process of having to negotiate with the vendor to “make changes and get things done.”

In essence, the future of the RecFind brand can be summarised as more of the same but as an even easier to use and more powerful product. Architecturally, we are moving away from the fat-client model (in our case based on the .NET smart-client paradigm) to the zero-footprint, thin-client model to reduce installation and maintenance costs and to support far more operating system platforms than just Microsoft Windows. The new version 2.6 web-client for instance happily runs on my iPad within the Safari browser and provides me with all the information I need on my customers when I travel or work from home (we use RecFind 6 as our Customer Relationship Management system or CRM). I no longer need a PC at home and nor do I need to carry a heavy laptop through airports.

One of my goals for the remainder of 2014 and into 2015 is to convince my customer base to move to the RecFind 6 web-client from the standard .NET smart-client. This is because the web-client provides tangible, measurable cost benefits and will be the basis for a host of new features as we gradually deprecate the .NET smart-client and expand the functionality of the web-client. We do not believe there is a future for the fat/smart-client paradigm; it has seen its day. Customers are rightfully demanding a zero footprint and the support of an extensive range of operating environments and devices including mobile devices such as smartphones and tablets. Our web-client provides the functionality, mobile device support and convenience they are demanding.

Of course the back-end of the product, the image and data repository, also comes in for major upgrades and improvements. We are sticking with MS SQL Server as our database but will incorporate a host of new features and improvements to better facilitate the handling of ‘big data’. We will continue to research and make improvements to the way we capture, store and retrieve data and because our customer’s databases are now so large (measured in hundreds of Gigabytes), we are making it easier and faster to both backup and audit the repository. The objectives as always are scalability, speed, security and robustness.

We are also adding new functionality to allow the customer to bypass our standard user interface (e.g., the .NET smart-client or web-client) and create their own user interface or presentation layer. The objective is to make it as easy as possible for the customer to create tailored interfaces for each operating unit within their organization. A simple way to think of this functionality is to imagine a single high level tool that lets you quickly and easily create your own screens and dashboards and program to our SDK.

On the add-in product front we will continue to invest in our add-in products such as the Button, the MINI API, the SDK, GEM, RecCapture, the High Speed Scanning Module and the SharePoint Integration Module. Even though the base product RecFind 6 has a full complement of enterprise content management functionality these add-on products provide options requested by our customers. They are generally a way to do things faster and more automatically.

We will continue to provide two approaches for document management; the end-user paradigm (RecFind 6 plus the Button) and the fully automatic capture and classification paradigm (RecFind 6 plus GEM and RecCapture). As has been the case, we also fully expect a lot of our customers to combine both paradigms in a hybrid solution.

The major architectural change is away from the .NET smart-client (fat-client) paradigm to the browser-based thin-client or web-client paradigm. We see this as the future for all application software, unconstrained by the strictures of proprietary operating systems like Microsoft Windows.

As always, our approach, our credo, is that we do all the hard work so you don’t have to. We provide the feature rich, scalable and robust image and data repository and we also provide all of the high level tools so you can configure your applications that access our repository. We also continue to invest in supporting and enhancing all of our products making sure that they have the feature set you require and run in the operating environments you require them to. We invest in the ongoing development of our products to protect your investment in our products. This is our responsibility and our contribution to our ongoing partnership.

 

Technology Trends for 2014 – A developer’s perspective

by Frank 7. January 2014 06:00

I run a software company called the Knowledgeone Corporation and we produce enterprise content management software for government and business. Because it takes so long to design, build and test a new product or even a new version, we have to try and predict where the market will be in one or two years and then try to make sure our product RecFind 6 ‘fits-in’ with future requirements.

Years ago it was much easier because we were sure Windows would be the dominant factor and mostly we had to worry about compatibility with the next version of Windows and Microsoft Office. Apple however, changed the game with first the iPhone and then the iPad.

We now need to be aware of a much wider range of devices and operating systems; smart phones and tablets in particular. Three years ago we decided to design in compatibility for iOS and Android and we also decided to ignore Blackberry; so far, a wise move.

However, the prediction business is getting harder because the game is changing faster and probably faster than we can change our software (a major application).

I was just reading about CES 2014 on ZDNet and the major technologies previewed and displayed there. Most are carry-overs from 2013 and I haven’t noted anything really new but, even so, the question is which of these major trends will become major players during 2014 and 2015 (our design, develop and test window for the next major release of RecFind 6)?

1.     Wearables

2.     The Internet of Things

3.     Contextual Computing (or Predictive Computing)

4.     Consumerization of business tech

5.     3D printing

6.     Big Data

7.     The Cloud

Larry Dignan, Editor in Chief of ZDNet, wrote an excellent summary of things to think about for 2014, see this link:

Larry sees China and emerging Chinese companies as major players outside of China in 2014 but I think the Europeans and Americans will resist until well into 2015 or later. Coming on the heels of the Global Financial Crisis of 2008 their governments won’t take kindly to having their local high tech industries swamped by Chinese giants. He also talks about the fate of Windows 8 and the direction of the PC market and this is our major concern.

The PC market has been shrinking and even though Microsoft is still by far the major player, a lot depends upon the acceptance of Windows 8 as the default operating system. Personally, I saw the Windows 8 Metro interface as clumsy and as change for change’s sake.

I really don’t understand Microsoft’s agenda. Why try to force a major change like this on consumers and businesses just when everyone is happy with Windows 7 and we have all almost forgotten Vista? Windows 8 isn’t an improvement over Windows 7, just as Office 2013 isn’t an improvement over Office 2010. Both are just different and, in my opinion, less intuitive and more difficult to use.

Try as I might, I cannot see any benefits to anyone in moving from Windows 7 to Windows 8 and in moving from Office 2010 to Office 2013. The only organization benefiting would be Microsoft and at the cost of big disruptions to its loyal customers.

Surely this isn’t a wise thing to do in an era of falling PC sales? Why exacerbate the problem?

Smart phones and tablets are real and growing in importance. Android and iOS are the two most important ‘new’ operating systems to support and most importantly for us, browsers are the application carriers of the future. No software vendor has the resources to support all the manifestations of Windows, Linux, Android, iOS, etc., in ‘native’ form but all operating systems support browsers. Browsers have become what Windows was ten years ago. That is, a way to reach most of the market with a single set of source code.

We lived through the early days of DOS, UNIX, Windows and the AS/400 and at one time had about fifteen different sets of source code for RecFind. No vendor wants to go back to those bad old days. When the world settled on Windows it meant that most of us could massively simplify our development regime and revert to a single set of source code to reach ninety-percent of the market. In the early days, Windows was our entry point to the world. Today it is browsers.

Of course not all browsers are equal and there is extra work to do to support different operating systems, especially sand-boxed ones like iOS, but we are still running ninety-five percent common source and five percent variations, so it is eminently manageable.

Does Microsoft realize that many developers like us now target browsers as our main application carriers and not Windows? Does it also realize that the Windows 8 Metro interface was the catalyst that pushed many more developers along this same path?

Let’s hope that the new CEO of Microsoft cares more about his customers than the previous one did. If not, 2014 won’t just be the post-PC era, it will also be the beginning of the post-Microsoft era.

Is this Microsoft’s worst mistake ever?

by Frank 30. November 2013 06:00

I run a software company called the Knowledgeone Corporation that has been developing application solutions for the Microsoft Windows platform since the very first release of Windows. As always, our latest product offering, RecFind 6 version 2.6, has to be tested and certified against the latest release of Windows. In this case that means Windows 8.1.

Like most organizations, we waited for the Windows 8.1 release before upgrading our workstations from Windows 7. The only exceptions were our developers’ workstations, because we bought them new PCs with Windows 8 pre-installed.

We are now testing the final builds of RecFind 6 version 2.6 and have found a major problem. The problem is that Microsoft in its infinite wisdom has decided that you can’t install Windows 8.1 over a Windows 7 system and retain your already installed applications.

The only solution is to install Windows 8 first and then upgrade Windows 8 to Windows 8.1. However, if you are running Windows 7 Enterprise this won’t work either and you will be told that you will have to reinstall all of your applications.

I am struggling to understand Microsoft’s logic.

Surely Microsoft wants all its customers to upgrade to Windows 8.1? If so, why has it ‘engineered’ the Windows 8.1 upgrade so customers will be discouraged from using it? Does anyone at Microsoft understand how much work and pain is involved in re-installing all your applications?

No, I am not kidding. If you have a PC or many PCs with Windows 7 installed you are going to have to install Windows 8 first in order to maintain all of your currently installed applications. Then, after spending many hours installing Windows 8 (it is not a trivial process) spend more precious time installing Windows 8.1. Microsoft has ensured that you cannot go direct from Windows 7 to Windows 8.1.

Of course, if you are unlucky, you could be living in a country where Microsoft has blocked the downloading of Windows 8, like Australia. Now you are between a rock and a hard place. Microsoft won’t let you install Windows 8 and if you install Windows 8.1 you face days or weeks of frustrating effort trying to re-install all of your existing applications.

 

Here are some quotes from Microsoft:

“You can decide what you want to keep on your PC. You won't be able to keep programs and settings when you upgrade. Be sure to locate your original program installation discs or purchase confirmation emails if you bought programs online. You'll need these to reinstall your programs after you upgrade to Windows 8.1—this includes, for example, Microsoft Office, Apache OpenOffice, and Adobe programs. It's also a good idea to back up your files at this time, too.”

“If you're running Windows 7, Windows Vista, or Windows XP, all of your apps will need to be reinstalled using the original installation discs, or purchase confirmation emails if you bought the apps online.”

If the management at Microsoft wanted to ensure the failure of Windows 8.1 they couldn’t have come up with a better plan than the one they have implemented. By making Windows 8.1 so difficult to install they have ensured that its customers will stick with the tried and proven Windows 7 for as long as possible.

Can anyone at Microsoft explain why they thought this was a good idea?

What is happening with the Tablet market?

by Frank 18. August 2013 06:00

I run a software company called the Knowledgeone Corporation and our main job is to provide the tools to capture, manage and find content. As such, we need to be on top of the hardware and software systems used by our customers so that we can constantly review and update our enterprise content management products like RecFind 6 so that they are appropriate to the times and devices in use.

I have spoken in previous Blogs about tablets and form factors and what is needed for business so other than providing the following links, I won’t go over old ground.

Will the Microsoft surface tablet unseat the iPad?

The PC is dead, or is it?

What will be the next big thing in IT?

Could you manage all of your records with a mobile device?

Why aren’t tablets the single solution yet?

The real impact of mobilization – How will it affect the way we work?

Mobile and the Web – The real future of applications?

Form factor – The real problem with mobile devices doing real work

Since my last Blog on the subject we have all seen RT tablets come and go (there will be a big landfill of RT tablets somewhere) and we are now all watching the slow and painful demise of Blackberry. In both of these cases we have to ask how big, super-clever companies like Microsoft and Blackberry could get it so wrong. Just thinking about the number of well-educated and highly experienced marketing and product people they have, it is inconceivable that they couldn’t work out what the average Joe in the street could have told them for free.

Then let’s also think about HP’s disastrous experiment with its TouchPad tablet (another e-waste landfill) and it becomes apparent that some of the largest, richest and best credentialed companies in the world can’t forecast what will happen in the tablet market.

In my opinion the problem all along, apart from operating system selection (iOS or Android?), has been matching needs to form factor and processing power. For example, no one wants a 12 inch phone and no one wants to write and read large documents on a 3 inch screen. This is why most of us still carry around three devices instead of one; a phone, a tablet and a laptop. This is just plain silly, what is the point of a small form factor device if I have to supplement it with a large form factor device? Like most other users, I really just want to carry around one device and I want it to have the capabilities and processing power for all the work I do.

It is for this reason that I believe the next big thing in the tablet market will be based on phones, not tablets. I envision slightly larger and much more powerful phones with universal connectors (are you listening Apple?) and docking capability. I would also like it to have a minimum of 4G and preferably 5G when available.

I want to be able to use it as a phone and when I get to my office I want to connect it to my keyboard, screen and network. I want to be able to connect it to a projector when visiting customers and prospects and I want a dynamically sizing desktop that knows when to automatically adjust the display to the form factor being viewed. That is, I want a different desktop for my screen at work than I want on the phone screen when travelling.

This brings up an interesting issue about choice of operating system as Windows owns about 95% of all business PCs and servers. I have previously never thought about buying a Windows Phone (I had one once a few years ago with Windows CE and it was awful) but my ideal device is going to have to run on the Windows operating system to be really usable in my new one-device paradigm.

I wonder why Microsoft didn’t think of this?
