Taking the pain out of data storage for legal firms


While most of the data-related headlines these days revolve around high-profile hacks and breaches, the law firm CIO will often find their data landscape coloured by more mundane daily aggravations - the failed backup, the corrupt tape, the accidental deletion of a file that's needed right now but whose only copy is sitting on a disk halfway across town. Yes, data security is a major headache, but data storage is a pain somewhere else altogether.

The volume of data held on corporate systems is growing exponentially, fuelled by the increasing use of rich content, remote working and email. Today over 100 billion emails are sent daily, with 140 billion predicted by 2018. Traditional technologies are struggling to keep pace with storage and backup requirements that can grow by up to 50% annually - and more and more money and effort is needed just to stand still.

Some organisations are now seeking salvation in Cloud-based data management solutions and their obvious attractions: the offloading of responsibility to a third party, the move from capital to operational expenditure, the easy scaling up of additional capacity, the securing of enterprise-grade disaster recovery, the improved data controls and compliance - the list goes on.

But even these organisations can be guilty of the behaviour at the root of the data management challenge: our tendency to store far too much, backing up anything and everything, compounded by a lack of business rules and inadequate retention and destruction policies. Moving to a managed data solution can certainly treat the symptom - riotous growth - but take it one step further and it can also directly address the cause: that basic lack of rigour in data management.

Multi-tier data management is an approach that puts an end to organisations treating all their data as one homogeneous lump, invariably using expensive primary or secondary storage to do so. Why? Because whereas today's transactional data is business-critical, a four-year-old email is just a business overhead. So why not tier data according to its value instead? Data tiering is based on the premise that all data should automatically receive the most appropriate treatment: replication for mission-critical data; backup for critical data; and archiving for important/legacy data.
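To make the tiering idea concrete, here is a minimal sketch of how such a policy might be expressed in code. The classification rules and age thresholds (30 days, one year) are purely illustrative assumptions for this example, not a prescribed standard - in practice they would come from the firm's own retention policy.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative mapping of data tiers to storage treatments,
# following the replication / backup / archive split described above.
TIERS = {
    "mission-critical": "replication",
    "critical": "backup",
    "legacy": "archive",
}

@dataclass
class DataItem:
    name: str
    last_accessed: date
    business_critical: bool

def assign_tier(item: DataItem, today: date) -> str:
    """Return the storage treatment for an item under this illustrative policy."""
    age = today - item.last_accessed
    # Business-critical and recently active: keep replicated for high availability.
    if item.business_critical and age <= timedelta(days=30):
        return TIERS["mission-critical"]
    # Touched within the last year: standard backup.
    if age <= timedelta(days=365):
        return TIERS["critical"]
    # Everything older: archive to cheap tertiary storage.
    return TIERS["legacy"]

today = date(2016, 1, 1)
print(assign_tier(DataItem("live-matter.db", date(2015, 12, 20), True), today))  # replication
print(assign_tier(DataItem("old-email.pst", date(2011, 6, 1), False), today))    # archive
```

The point of automating the rules is that no one has to decide file by file: classification happens as a side effect of normal activity, which is what lets the bulk of inactive data drift down to cheaper storage.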

This simple expedient enables some 70% of data volumes to be instantly removed from expensive primary storage and placed on hugely cost-efficient tertiary storage; the remainder can be placed within a storage environment geared for high availability and fast recovery, at an overall cost that is still lower than traditional high-end backup and DR solutions.

And simple expedient it may be, but it also represents a step change in the dynamics of data storage - with plenty of resultant benefits: lower overall spend on data and information management; greater control over data growth by classifying and archiving inactive data; a smaller primary data set, and with it less backup data, a shorter backup window and faster disaster recovery; more centralised management of data stores for easier compliance and e-disclosure; increased data availability through a lighter load on primary servers and storage; and more resources, money and energy freed up to spend on 'added value' projects.

Data storage remains a major pain point for legal. But with a tiered approach, you have a very effective, very affordable means of pain relief.
