Data Access Accountability - Who Did What to Your Data When? - A Lumigent Data Access Accountability Series White Paper
Dr. Murray S. Mazer
Chief Technology Officer, Lumigent Technologies, Inc.
Overview

Enterprises today rely on database technology to run their business. The often mission-critical data assets in these databases must be safeguarded from inappropriate access and data changes. Protecting data security and privacy has become a paramount concern for most organizations, driven by customer or supplier requirements, business practices, security policies, and government regulations. At the same time that business and IT management address these requirements, the IT staff has an ongoing operational need to manage data integrity and availability. However the need arises, the requirement can be characterized as "data access accountability": knowing who is doing what to which data, and by what means.

Meeting these challenges requires an effective audit trail and timely alerting on key events. This in turn requires a number of capabilities: capturing a record of data access and permissions changes, managing the captured information for lengthy periods, flexibly analyzing the information, producing reports, and detecting conditions of interest for timely notification, all while avoiding performance overhead on critical systems.

Capturing a record of data access is a key step, yet common approaches have important flaws: they miss certain kinds of activity, introduce a false sense of security, and interfere with runtime database performance. Many organizations, aware of these shortcomings, choose not to implement such safeguards at all, leaving themselves unable to respond to their business needs. Lumigent has introduced Entegra™ as a solution for data access accountability.
Entegra provides a complete record of data activity with active monitoring and alerting, answering the question "who did what to which data when, and by what means?" Entegra can reduce the operational and implementation costs and risks associated with other approaches. This white paper examines the need for data access accountability and the alternative approaches to meeting that need. To facilitate evaluation of data access accountability solutions, a checklist of questions and issues to consider is offered at the end.

The Problem

Databases are a critical component of today's systems, and data integrity is a paramount concern for anyone responsible for them. Safeguarding data assets is multi-faceted, but a central aspect is ensuring that: 1) data is changed only in intended ways, and 2) only the proper parties view the data. Implementing suitable privacy and security policies and mechanisms is an important step, but it does not address two important realities: 1) even authorized users will sometimes access data inappropriately, whether deliberately or accidentally, and 2) flaws in policy and implementation can introduce vulnerabilities, enabling unintended data access or database changes.
These realities speak to the importance of data access accountability, that is, the ability to determine who did what to which data when, and by what means. This capability allows an organization to:

• comply with government regulations regarding the security and privacy of certain kinds of data (e.g., HIPAA, GLBA, FDA Title 21 CFR Part 11, EU DPA)
• comply with internal corporate processes; understand and improve internal business processes
• detect and analyze breaches in user and application behavior, intentional or accidental
• perform forensic analysis for detecting fraud, outsider intrusion, and employee misbehavior
• rapidly respond to violations and vulnerabilities
• verify strategic partner activities
• verify third-party application behavior
• answer ad hoc business questions
• satisfy external due diligence for strategic relationships or customer confidence.

Technology is key to meeting these challenges. An organization with any of these requirements should engage in a problem analysis lifecycle similar to the following:

• identify applicable strategic and regulatory requirements
• analyze existing policies and technologies to identify aspects of inadequate coverage
• update policies and procedures toward compliance
• educate staff and partners on new policies and procedures
• identify changes to technology infrastructure to support implementation and verification of new policies and procedures
• implement new systems
• validate the behavior of the new systems.

In the end, because technical systems are involved in storing data, technical systems must be involved in safeguarding the data. To address this, the IT staff will have the following requirements for data access accountability:

• be notified when someone changes database schema or permissions
• keep a record of all changes to schemas and permissions
• know what data was changed, when, and by whom
• know who has viewed certain data and when
• generate periodic reports on who accessed certain tables
• investigate suspicious behavior on certain tables
• know who modified a set of tables over a period of time
• automate procedures across multiple servers.
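Several of the requirements above ("know who modified a set of tables over a period of time", "generate periodic reports on who accessed certain tables") reduce to queries over an audit repository. The following toy sketch runs such a query against a hypothetical audit table; the schema and column names are invented for illustration and do not reflect any particular product:

```python
import sqlite3

# Hypothetical audit-trail table; the schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE audit_trail (
        event_time TEXT,     -- when the operation happened
        user_name  TEXT,     -- who issued it
        operation  TEXT,     -- INSERT / UPDATE / DELETE / SELECT
        table_name TEXT      -- which table was touched
    )
""")
conn.executemany(
    "INSERT INTO audit_trail VALUES (?, ?, ?, ?)",
    [
        ("2002-06-01 09:15:00", "alice", "UPDATE", "payroll"),
        ("2002-06-03 14:02:00", "bob",   "DELETE", "payroll"),
        ("2002-06-09 11:30:00", "carol", "SELECT", "payroll"),
    ],
)

# "Who modified a set of tables over a period of time?" -- modifications
# only, so SELECT (viewing) activity is excluded from this particular report.
rows = conn.execute("""
    SELECT DISTINCT user_name, operation, table_name
    FROM audit_trail
    WHERE table_name IN ('payroll')
      AND operation IN ('INSERT', 'UPDATE', 'DELETE')
      AND event_time BETWEEN '2002-06-01 00:00:00' AND '2002-06-07 23:59:59'
    ORDER BY user_name
""").fetchall()
for user, op, table in rows:
    print(user, op, table)
```

Note that the same table answers both modification and viewing questions; only the `operation` filter changes between reports.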
Elements of an Effective Solution

How a data access accountability solution captures the appropriate data for an audit trail is as important as determining what data should be captured. Once the appropriate level of detail for the audit trail is determined, an effective solution should provide confidence that all the relevant activities that create, modify, or delete data will be captured, and that activities will not inadvertently be omitted from the audit trail. An effective solution providing data access accountability must include these capabilities:

• Capture data access: automatically track whenever data is modified or viewed by any means on target databases, preferably with control over the granularity of data tracked
• Capture structural changes: automatically track changes both to the permissions that control data access and to database schema (to ensure ongoing integrity of the structures storing data)
• Manage captured information: automatically collect the tracked information from multiple databases into an easily managed, long-term, common repository
• Centralized configuration and management of all servers: provide a straightforward way to configure auditing of all the target servers, specifying the activities of interest, the repository for managing the information, and the schedule for transferring data
• Flexible information analysis: provide flexible and efficient means for processing the stored information to identify activities of interest
• Produce reports: support ad hoc and periodic exporting of analysis results in a variety of formats, for display, printing, and transmission
• Detect conditions of interest for notification: automatically monitor for conditions of interest and generate selected alerts.
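The capture capabilities above ultimately reduce to recording a stream of events, each answering who, did what, to which data, when, and by what means. The record type below is a minimal illustration of that idea; the field names are assumptions for this sketch, not a real product schema:

```python
from dataclasses import dataclass
from datetime import datetime

# Minimal audit-event record covering the questions an audit trail must
# answer. Field names are invented for illustration.
@dataclass(frozen=True)
class AuditEvent:
    who: str          # database login or application user
    what: str         # operation: INSERT, UPDATE, DELETE, SELECT, GRANT, ALTER...
    which: str        # object touched, e.g. "hr.payroll"
    when: datetime    # time of the operation
    how: str          # means of access: application name, admin console, etc.

event = AuditEvent(
    who="alice",
    what="UPDATE",
    which="hr.payroll",
    when=datetime(2002, 6, 1, 9, 15),
    how="payroll_app",
)
print(event.who, event.what, event.which)
```

Structural changes (GRANT, ALTER) fit the same record shape as data access, which is what lets one repository serve both kinds of auditing.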
Capturing Data Activity: The Key to an Effective Solution

Current approaches to data access accountability are subject to common pitfalls that may, over time, create compliance risk or increase the cost of implementing compliance with a business requirement or regulation. These approaches include application modification, mid-tier portals, and trigger-based collection at the data source.

Application modification

This approach entails changing the source code of every application that might be used to access the data of interest, so that each application captures data modification and viewing information and stores it for further processing.
Because each application must be modified, this approach can substantially increase the implementation cost of compliance with data auditing requirements, especially where legacy systems must be brought into compliance. In addition, because it requires each application to be modified and does not capture activity outside the modified applications, it reduces confidence in the completeness of the audit trail: operations not handled by the modified applications may be missed, and access made directly to the underlying database is not recorded at all.

The application modification approach may also create security vulnerabilities because it cannot capture changes to permissions and schema. Recording who creates, modifies, or deletes data is critical to establishing accountability for actions taken. If the audit trail does not capture changes to permissions, it becomes possible to silently alter who has legitimate authorization to create, modify, or delete data. If the audit trail does not capture changes to schema, the door is open to unauthorized alteration of what data is captured.

In summary, the application modification approach has several implications:

• each application must be modified (or, if that is not possible, replaced)
• planning, implementing, and testing these changes is costly and time-consuming, and it is difficult to guarantee complete coverage
• access outside of the modified applications (e.g., via a database administrative console) is not captured, implying incomplete coverage
• changes to permissions and schema cannot be captured by this means.

Mid-tier portal

Some application architectures funnel data access through a shared portal that is responsible for backend access. This portal could be modified to capture and store data access information.
As with application modification, the mid-tier portal approach may pose compliance risks. Operations and access outside of the portal-enabled applications may not be captured by the audit trail. Implementation costs may also increase, because other approaches to creating audit trails would be needed wherever data is created, modified, or deleted outside portal-enabled applications. This limitation may make the portal approach especially inappropriate for legacy systems that contain data subject to auditing requirements.

Because the portal approach does not capture changes to permissions and schema, critical security vulnerabilities may arise. As with application modification, this creates the potential for unauthorized changes to levels of access or to the scope of content captured for auditing.

In summary, while the portal approach avoids modifying individual applications, it has three substantial drawbacks:

• it only works for portal-enabled applications
• it cannot capture access that does not pass through the portal
• it cannot capture changes to permissions and schema.
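Both application-level approaches above amount to the same pattern: wrap the code paths you control so they log before acting. The sketch below shows that pattern with a hypothetical wrapper (all names invented); its blind spot is exactly the one described above, since any access that bypasses the wrapped functions leaves no trace:

```python
import functools

audit_log = []  # stand-in for wherever the modified application stores its trail

def audited(operation, table):
    """Wrap an application data-access function so it records an audit entry."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(user, *args, **kwargs):
            audit_log.append((user, operation, table))  # record before acting
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@audited("UPDATE", "payroll")
def update_salary(user, employee_id, new_salary):
    ...  # the application's actual database call would go here

update_salary("alice", 42, 95000)
print(audit_log)

# The flaw: a direct UPDATE issued from a database administrative console
# never passes through update_salary(), so it is invisible to this trail.
# Likewise, a GRANT or ALTER TABLE is never seen by application code at all.
```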
Trigger-based collection at the data source

Most users dread the traditional way of capturing data modifications: triggers (special-purpose application logic) on the database. Triggers have a number of drawbacks:

• they are often hard to write correctly
• they add substantial runtime performance overhead (because they execute in line with transactions, reducing throughput)
• fear of this overhead leads DBAs to minimize the number of modifications recorded or the period over which they are recorded
• they cannot capture data views
• they cannot capture changes to schema and permissions.

Non-trigger tracking at the data source

In this approach, non-trigger audit agents are associated with each database server containing important data. These audit agents harvest information about data-related activity, and because they operate at the database server, they capture all relevant data activity regardless of the application used. Applications need not be modified to accommodate this approach. The audit agents harvest information through two primary means:

• reading the database transaction log, which each database maintains in the normal course of its operation, for data modifications; this does not interfere with the timely execution of transactions, because the analysis can be time-shifted or carried out on machines other than the one hosting the target database
• using the database's built-in event notification mechanism to obtain a record of permission changes, schema changes, and data views.

Introducing Lumigent® Entegra™

An effective data access accountability solution should enable an organization to meet its data auditing requirements while avoiding the common pitfalls of current approaches.
Entegra provides both a complete record of data activity and active monitoring and alerting, enabling answers to the question "who did what to which data when, and by what means?" These capabilities help IT staff and DBAs solve the business problems and operational needs identified previously, and form the core of sound data integrity and accountability practices.

In the Entegra approach, non-trigger audit agents are associated with each database server where data access and data changes need to be captured. These audit agents harvest information about data activity, and because they operate at the database server level, they capture all relevant data activity regardless of the application used. The design features and capabilities of Entegra reduce the potential for the increased costs and risks associated with other approaches to data access accountability.
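To make the contrast between trigger-based and engine-level capture concrete, the toy sketch below uses SQLite purely as a stand-in (this is not how Entegra is implemented). The trigger executes in line with the modifying transaction, which is the source of its overhead, and can never fire on a SELECT; the engine-level trace hook observes every statement, data views included, without touching application code:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Trigger-based capture: fires inside the modifying transaction,
# adding overhead, and cannot fire on SELECT (data views).
conn.executescript("""
    CREATE TABLE payroll (employee_id INTEGER, salary INTEGER);
    CREATE TABLE audit_trail (operation TEXT, employee_id INTEGER,
                              old_salary INTEGER, new_salary INTEGER);
    CREATE TRIGGER payroll_update_audit AFTER UPDATE ON payroll
    BEGIN
        INSERT INTO audit_trail VALUES
            ('UPDATE', OLD.employee_id, OLD.salary, NEW.salary);
    END;
""")

# Engine-level capture: analogous in spirit to a non-trigger audit agent;
# it sees every statement, SELECTs included, with no application changes.
captured = []
conn.set_trace_callback(captured.append)

conn.execute("INSERT INTO payroll VALUES (42, 90000)")
conn.execute("UPDATE payroll SET salary = 95000 WHERE employee_id = 42")
conn.execute("SELECT salary FROM payroll")

# The trigger recorded only the UPDATE; the trace hook also saw the SELECT.
print(conn.execute("SELECT * FROM audit_trail").fetchall())
print([s for s in captured if s.startswith("SELECT salary")])
```

A real agent would of course read the server's transaction log and event stream rather than a per-connection hook, but the division of labor is the same: modifications from the log, views and structural changes from event notification.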
The following components appear in a fully configured Entegra deployment:

• Audit Agent: attached to a database server, the audit agent is responsible for harvesting the desired information about data modification, data viewing, and structural activity on that server. The audit agent is application independent, though tailored for each database platform. Together with the Management Console component, this enables Entegra to be configured to capture data for the audit trail as determined by the applicable predicate rule and the type of operation relevant to the electronic records. The audit agent also provides real-time notification of structural changes.
• Repository: the repository receives and stores the information harvested by the audit agent(s). Together with the Archive component, this prevents the audit trail from obscuring existing records and enables copies of the audit trail to be produced for review.
• Management Console: the console determines the schedule and configuration of each audit agent for harvesting and transferring information to the repository. Together with the Audit Agent component, this enables Entegra to be configured to capture data for the audit trail as determined by the applicable predicate rule and the type of operation relevant to the electronic records.
• Report server with browser interface: the report server provides secure mechanisms for processing the repository for query, analysis, and reporting. The report engine provides a familiar, richly featured, web-based user interface for highly interactive analysis of the repository information. This enables analysis of data activity and copies of audit trails to be produced for review.
• Archive: the archive provides long-term storage of access information from multiple repositories. Together with the Repository component, this prevents the audit trail from obscuring existing records and enables copies of audit trails to be produced. The archive can be maintained for any length of time, which may be dictated by the data auditing requirement.

A complete record of data activity provides:

• Compliance: an archival record of access to data and of schema and permissions changes
• Verification: validation of activity on data and schema
• Security: a reliable, independent source of access and change history to identify the responsible application and user
• Investigation: support for damage assessment, fraud detection, and forensics.

Active monitoring and alerting provides:

• Security: reliable notification of changes to permissions, which can provide validation of proper activity or an early indication of malicious intent, violations, and vulnerabilities
• Integrity: reliable notification of changes to structure, permitting verification of correct implementation and rapid response to incorrect changes.
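Conceptually, the console-managed configuration for a deployment like the one above might resemble the following. Every key and value here is invented for illustration; the paper does not describe Entegra's actual configuration format:

```python
# Hypothetical central audit configuration, as a management console
# might maintain it for its audit agents. All names are assumptions.
audit_config = {
    "servers": ["db01.example.com", "db02.example.com"],
    "capture": {
        "modifications": True,        # INSERT / UPDATE / DELETE
        "views": True,                # SELECT activity
        "schema_changes": True,       # DDL
        "permission_changes": True,   # GRANT / REVOKE
    },
    "transfer_schedule": "daily 02:00",      # move harvested data to repository
    "repository": "audit-repo.example.com",
    "alert_on": ["schema_changes", "permission_changes"],
}

print(len(audit_config["servers"]))
```

The point of centralizing this is the checklist item that follows: one console configuring what is captured, where it is stored, and when it is transferred, across every audited server.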
Conclusion

The ideal technology solution for compliance and regulatory requirements depends on an effective data capture capability. The best approach minimizes performance overhead while providing a complete audit of data access, together with active monitoring and alerting. A number of approaches may be considered, though many have shortcomings that degrade system performance and/or require additional technical resources.

All things considered...

In parallel with the development of policies, procedures, and technical requirements, it is critical to evaluate solutions that effectively capture and audit data access. The following checklist is offered to facilitate evaluation of data access accountability solutions:

• Is data access capture complete (no backdoors through which a user may access data without being detected)?
• What is the cost of deployment?
• Does the solution use triggers (which may impact performance)?
• Does the solution require application software modification (which is time-consuming)?
• Is it easy to administer?
• Is there a single console for configuration and scheduling across multiple database platforms?
• Is there a common repository and long-term archival support?
• Does it provide a complete modification history?
• Does the solution alert on critical database changes (schema and permissions)?
• Does the approach support multiple platforms?

About the Author: Dr. Murray S. Mazer is co-founder, Vice President, and CTO of Lumigent Technologies, creators of the award-winning Log Explorer® technology. Dr. Mazer has 20 years of experience at early-stage and established companies. He is an inventor and expert witness in several software technology areas. Dr. Mazer received his Ph.D. from the University of Toronto for his work in the database group. He can be reached at murray.mazer@lumigent.com.

For more information please visit www.lumigent.com

© 2002, Lumigent Technologies, Inc. All rights reserved.