Reservoir Surveillance of Dynamic Data


Permanent Downhole Gauges (PDG) are a remarkable source of information, providing both long-term production data and occasional build-ups that may be described as ‘free well tests’. Data are acquired at high frequency and over a long duration. The downside is the sheer number of data points gathered, which can amount to hundreds of millions per sensor, far beyond the processing capability of even today’s fastest PC. The challenges are numerous: storing and accessing the raw data, filtering it, transferring it to the relevant analysis module and, finally, sharing both the filtered data and the analyses.
Diamant Master is a client-server solution for reservoir surveillance that addresses these issues in a shared environment. It permanently mirrors raw data from any data historian, reduces the number of points with wavelet-based filtering, stores and shares the filtered data. Filtered data can be exported to third party databases.
Derived data can be created and updated through user-controlled mathematical operations on existing data, and Boolean alarms can be created and used over a network. Diamant Master also stores technical objects and maintains the data with enterprise-wide consistency, avoiding repetitious data handling and speeding up the workflow. Diamant Master is administered, and partially operated, through a WEB client.
Diamant main window

What PDG data provides
PDGs acquire pressure data at high frequency and over a long duration. A typical data set carries two types of information. Each spike is an unscheduled shut-in that may be treated as a ‘free’ well test for PTA. In addition, the long-term producing pressure response, ignoring these spikes, can be used together with the well production to perform production analysis and/or history matching.
The data is there and it is already paid for; it is ‘simply’ a matter of getting at it and interpreting it. Nice idea, but there is one not-so-little problem: the available data is vast and growing. A single gauge typically holds 3 to 300 million data points, enough to bring even the fastest of today’s PCs to a grinding halt. Yet we need both the short-term, high-frequency data for PTA and the long-term, low-frequency data for PA.
Wavelet filtering
To perform a transient or a production analysis we typically need around 100,000 data points. The trouble is that each analysis needs a different 100,000 points from the same data set. To deliver both, Diamant Master (DM) uses a wavelet algorithm; the decomposition is presented in the diagram below.
Wavelets may be described as a ‘smart’ filter with a threshold. For each point the local noise is estimated at different frequencies. If the local noise is above the threshold, as happens at the pressure breaks when the well is shut in, the signal is considered significant and is kept: the wavelets act as a high-pass filter. Conversely, if the noise level is below the threshold, it is just noise and is filtered out: the wavelets act as a low-pass filter. As a result, producing pressures are reduced to a few points per day, while all early-time shut-in data are preserved.
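The threshold idea can be sketched with a one-level Haar transform. This is a minimal illustration only, not Diamant Master's actual multi-level wavelet filter, which is not public; the function name and threshold value are assumptions:

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet denoising sketch: detail coefficients whose
    magnitude falls below `threshold` are treated as noise and zeroed,
    while larger ones (e.g. pressure breaks at shut-in) are kept."""
    n = len(signal) - len(signal) % 2           # even length for pairing
    pairs = signal[:n].reshape(-1, 2)
    approx = pairs.mean(axis=1)                 # low-pass (smooth) part
    detail = (pairs[:, 0] - pairs[:, 1]) / 2.0  # high-pass (noise or breaks)
    detail[np.abs(detail) < threshold] = 0.0    # hard threshold
    # Inverse transform: each pair is rebuilt from its mean and kept detail
    rec = np.empty(n)
    rec[0::2] = approx + detail
    rec[1::2] = approx - detail
    return rec
```

In a real multi-level filter the same keep-or-discard decision is applied at several frequency scales, so smooth producing periods collapse to a few points per day while the sharp breaks survive.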
Diamant Master workflow
Diamant Master is a continuous process installed on a dedicated machine running Windows Server™. It is operated by engineers, subject to privilege, using Ecrin or a WEB client. Users navigate the historian databases and select the tags to be imported. Diamant Master remains connected to the historians, from which it sequentially mirrors the raw, unfiltered data. For each mirrored data set, users with the appropriate privilege may define one or several wavelet filters per tag. The filters are executed on user request, or automatically once sufficient new data have been mirrored. Users may also order partial reloads of legacy data with a different filter setting, or with no filter at all.
The filtered data is stored in the local DM database to be subsequently sent to Ecrin analysis modules on a single drag-and-drop. This data may also be exported to a third party database.
Diamant Master stores KAPPA technical objects and files in a hierarchic and intuitive structure to be shared by Ecrin interpretation modules.
Connecting to data
The beauty of standards is that there are so many to choose from. So it is in the oil industry: there is no standard way to store PDG data. Almost every provider has its own data model, and operators routinely deal with several providers as well as their own data model. Most databases offer low-level access (ODBC, OLEDB, OPC, etc.), but this is, at best, cumbersome for end users. Each data model requires a specific adaptor to navigate and access the data. A published API permits the development of customized adaptors, which are automatically downloaded by Ecrin from Diamant Master.
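The adaptor idea can be sketched as a small interface behind which each historian's data model is hidden. This is an invented illustration, not KAPPA's published API; the class and method names are assumptions:

```python
from abc import ABC, abstractmethod
from typing import Iterator, List, Tuple

class HistorianAdaptor(ABC):
    """Hypothetical adaptor interface: each historian data model gets its
    own implementation, so the server can browse tags and mirror raw
    points through one uniform API."""

    @abstractmethod
    def list_tags(self) -> List[str]:
        """Return the tags available in this historian."""

    @abstractmethod
    def read_points(self, tag: str, start: float,
                    end: float) -> Iterator[Tuple[float, float]]:
        """Yield (timestamp, value) pairs for `tag` within [start, end)."""

class InMemoryAdaptor(HistorianAdaptor):
    """Toy adaptor backed by a dict, standing in for a real historian."""

    def __init__(self, data):
        self._data = data  # {tag: [(t, value), ...]}

    def list_tags(self):
        return sorted(self._data)

    def read_points(self, tag, start, end):
        return (pt for pt in self._data[tag] if start <= pt[0] < end)
```

The mirroring process only ever talks to the abstract interface, which is what makes it possible to ship new adaptors without touching the server itself.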
Data processing
When connecting to a new tag, Diamant Master performs a quick scan of one point in every ten thousand to preview the data and help spot anomalies and gross errors. A user-defined data window can immediately discard obvious outliers. A first series of points, typically 100,000 or one week of data, is then used in an interactive session in which the engineer adjusts the wavelet settings and the data post-filtering, based on a maximum Δt and Δp. Upon user acceptance, the filtering is performed over overlapping increments the size of the initial sample.
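The three preparation steps above, quick scan, outlier window and Δt/Δp post-filtering, can be sketched as follows. The function names and parameter choices are assumptions for illustration, not the product's internals:

```python
import numpy as np

def quick_scan(values, step=10_000):
    """Preview one point in every `step` to spot gross errors cheaply."""
    return values[::step]

def apply_window(t, p, p_min, p_max):
    """Discard obvious outliers outside a user-defined pressure window."""
    keep = (p >= p_min) & (p <= p_max)
    return t[keep], p[keep]

def post_filter(t, p, dt_max, dp_max):
    """Keep a point whenever more than `dt_max` has elapsed, or the
    pressure has moved by more than `dp_max`, since the last kept point."""
    kept = [0]
    for i in range(1, len(t)):
        if t[i] - t[kept[-1]] >= dt_max or abs(p[i] - p[kept[-1]]) >= dp_max:
            kept.append(i)
    return t[kept], p[kept]
```

On a smooth producing period the Δt criterion dominates and only a few points per interval survive; during a build-up the Δp criterion fires on every sample, so the transient is kept densely.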
Derived channels
These are user defined and permit mathematical operations on data channels through a comprehensive formula package. The outcome may be another data set, or a Boolean function of time that may be used to create an alarm. An alarm can trigger a display in the Diamant window, the sending of an alarm e-mail, or the call of a user-defined DLL.
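The two kinds of outcome, a new data channel and a Boolean alarm condition, can be illustrated with a toy formula evaluator. The function and channel names here are invented for the example, not the product's formula package:

```python
import numpy as np

def derived_channel(formula, **channels):
    """Evaluate a user-defined formula over existing channels. The result
    is either a new numeric series or a Boolean series usable as an alarm."""
    return formula(**channels)

# Hypothetical input channels: reservoir and flowing pressure
p_res = np.array([3000.0, 2990.0, 2500.0])
p_wf = np.array([2800.0, 2600.0, 2400.0])

# A derived data set: drawdown = p_res - p_wf
drawdown = derived_channel(lambda p_res, p_wf: p_res - p_wf,
                           p_res=p_res, p_wf=p_wf)

# A Boolean alarm condition: flowing pressure below a limit
low_p_alarm = derived_channel(lambda p_wf, **_: p_wf < 2500.0,
                              p_res=p_res, p_wf=p_wf)
```

Whenever the Boolean series turns true, the server would then take the configured action (window display, e-mail, or DLL call).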
Identifying shut-ins... automatically
Until recently, all algorithms, including wavelets, failed miserably to identify shut-ins automatically, especially when data sets contained both soft and hard shut-ins. Diamant Master has an exclusive algorithm which automatically identifies shut-ins: years of PDG data can be scanned in seconds. Shut-ins are identified and made available to the user for analysis in Saphir NL. This was the missing link to full automation of the data processing. The shut-in times may also be used to modify the production history in order to honor both the shut-in periods and the cumulative production.
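To make the task concrete, here is a deliberately naive detector, flagging any sustained pressure rise as a shut-in start. This is an illustration only; the actual Diamant Master algorithm is proprietary, and the names and thresholds are assumptions:

```python
def find_shutins(p, rise_min, window):
    """Naive shut-in detection sketch: flag index i as a shut-in start
    whenever pressure rises by more than `rise_min` over the next
    `window` samples, then jump past that build-up."""
    starts = []
    i = 0
    while i < len(p) - window:
        if p[i + window] - p[i] > rise_min:
            starts.append(i)
            i += window  # skip past this build-up before searching again
        else:
            i += 1
    return starts
```

A detector this simple is exactly what fails on the mix of soft and hard shut-ins mentioned above (a slow, noisy build-up may never clear the fixed threshold), which is why a more robust, adaptive algorithm was needed.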
Transferring data to Ecrin analysis modules
Filtered data can be transferred to any Ecrin analysis module by drag-and-drop. Shut-ins are analyzed and compared using Saphir NL; producing rates and pressures can be analyzed and matched with Topaze NL; and filtered data may also be used to constrain Rubis models.
Express shut-in(s)
With the transients identified and the daily rates correctly allocated and cleaned, shut-in data can be sent, en masse or individually, to a Saphir NL document automatically created by Ecrin. The result can be the latest shut-ins, or a cloud of transients from previous years that may be analyzed together, as a selected group or individually.
WEB access and administration
Diamant is the best way to handle data, technical objects and files when using KAPPA applications. However, these can also be accessed from an Internet browser by connecting to the DM server IP address or its domain name. The engineer can view the status of the different processes, access the data tables and technical objects, and recover the filtered data in Excel™ format without using Ecrin. An ActiveX control can also be loaded to navigate the data structure in the same environment as Diamant.
What’s next?
Diamant Master v4.12 is compatible with both Ecrin v4.12 and v4.20. A major change of generation is currently taking place. KAPPA Server v5.0 will replace Diamant Master v4.12 in 2012.
KAPPA Server will be our first Generation 5 product, developed under the Microsoft .NET environment. It will interface with Ecrin v4.20 and will integrate our second generation of wavelet filters, allowing raw data to be filtered with no prior interpolation.
Typical PDG data response gathered over two weeks

Wavelets decomposition and denoising algorithm

Wavelets denoising: (1) raw data = 10,000 points; too low (2), too high (3) and selected (4) thresholds; (5) post-filtering; (6) filtered data = 70 points

Diamant Master processes

Automatic Shut-in identification in Diamant Master v4.12

Multiple build-up analysis in Saphir

History matching in Topaze

Build-ups identified in Diamant and ready to send to Saphir

Generation 5 interface

A second generation of wavelets