What Does the DAT (or Data) File Tell Me About My Abaqus Finite Element Analysis (FEA) Job?

Updated: Dec 9, 2020

Welcome to another installment of our “Debugging 101” blog series aimed at helping CAE & FEA analysts debug FEA models more efficiently. Today’s post focuses on the .dat, or data, file from an Abaqus finite element analysis job. In previous articles, we've discussed where to start when debugging an Abaqus model and taken a closer look at the .sta, or status, file.


This blog is intended to help Abaqus users understand the basics of the .dat file by highlighting many of the things that can be deduced from a standard Abaqus data file. Although there are several additional output requests that can be written to the data file, this post will cover only the basic information that is provided for most Abaqus submissions.


At a high level, the role of the .dat file is to summarize the information generated during the input processing phase of a job submission. Input processing is performed by the executable “Pre.exe” and must complete before the job solve begins (which is handled by Standard.exe for an implicit analysis). This phase is typically quite quick and is responsible for a number of operations, which are discussed below.
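

If you want to generate the .dat file without committing to a full solve, Abaqus can be asked to run just this pre-processing phase. As a minimal sketch (assuming the abaqus command is on your PATH and using a hypothetical job name), something like the following Python snippet will do it:

import subprocess

# Hypothetical job name; beam_test.inp is assumed to sit in the working directory.
job_name = "beam_test"

# "datacheck" stops the run after the pre-processing phase, so the .dat file is
# produced without starting the solve; "interactive" keeps the command attached
# so the return code reflects whether the check succeeded.
subprocess.run(f"abaqus job={job_name} datacheck interactive", shell=True, check=True)

print(f"Data check complete - inspect {job_name}.dat")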


Parallelization

 
 

Firstly, the input processing phase distributes the analysis across multiple processors if more than one CPU core is being used - and with today’s available computing power, this is almost always the case (although certain analysis techniques prohibit parallelization). Parallelizing means breaking a process down into smaller parts and distributing them evenly across several processors or cores. This reduces run times since the equations can be solved in parallel rather than in sequence. By reviewing the .dat file, you can see how different parts of the analysis (e.g. contacts, elements and nodes) have been parallelized, as shown in the example above.
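

The number of cores is simply a job submission option. As a minimal sketch (again assuming the abaqus command is on your PATH and a hypothetical job name), a parallel run can be launched like this, after which the .dat file reports how the model was split across processors:

import subprocess

# Submit the job on 8 CPU cores; the .dat file will then report how the
# contacts, elements and nodes were distributed across the processors.
subprocess.run("abaqus job=beam_test cpus=8 interactive", shell=True, check=True)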


Adjustments to Node Locations

 
 

When contact or ties are included in an analysis, Abaqus will often adjust the initial positions of certain nodes to satisfy the contact specifications defined by the analyst. This is usually done to ensure that the slave nodes reside directly on the master surface of a given contact pair (which drastically improves contact convergence). When this happens, the .dat file reports which nodes were moved and by what distance, as well as the new coordinates of any adjusted nodes. Typically, when a large number of nodes have been moved, Abaqus will only print the first 20 to the .dat file, although it is certainly possible (and often advised) to output all adjusted nodes using the *PREPRINT option.
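

If you just want to confirm whether any nodes were moved (and roughly how many), a quick scan of the .dat file can save some scrolling. The sketch below is assumption-laden: the search token "ADJUSTED" and the file name are placeholders that may need tuning for your Abaqus version and job.

from pathlib import Path

dat_file = Path("beam_test.dat")  # hypothetical file name

# Collect every line that mentions an adjustment; the exact wording in the
# .dat file varies between versions, so treat this token as a starting point.
hits = [line.rstrip() for line in dat_file.read_text(errors="ignore").splitlines()
        if "ADJUSTED" in line.upper()]

print(f"{len(hits)} line(s) mention adjusted nodes:")
for line in hits:
    print(line)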


Element Quality Check

 
 

The Abaqus input processing phase also performs an assessment of element quality. Here, you’ll find a listing of all elements that fall outside the suggested quality criteria defined by Abaqus. As shown in the example above, tetrahedral elements whose angles between isoparametric lines fall outside the preferred 45-135° range are printed to the .dat file by default. The resulting list includes the element number of each failed element as well as an overall quality measure, which, at a glance, helps analysts identify the elements that may need improvement before solving an analysis.


Don’t worry too much if you see this in your models though - in fact, almost all models of moderate complexity will contain at least a few out-of-shape elements. While you should certainly pay attention to distorted elements and ensure they don’t reside in areas of the model that are of particular interest (e.g. high-stress areas), it is not always necessary to improve the quality of all elements printed to the .dat file.
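

To get a feel for what the angle criterion means in practice, here is a simplified, hypothetical illustration in Python - it checks the corner angles of a single triangular face against the 45-135° window and is not Abaqus’s exact quality measure:

import math

def corner_angles(a, b, c):
    """Return the three interior angles (in degrees) of triangle a-b-c."""
    def angle_at(p, q, r):
        v1 = [q[i] - p[i] for i in range(3)]
        v2 = [r[i] - p[i] for i in range(3)]
        dot = sum(x * y for x, y in zip(v1, v2))
        n1 = math.sqrt(sum(x * x for x in v1))
        n2 = math.sqrt(sum(x * x for x in v2))
        return math.degrees(math.acos(dot / (n1 * n2)))
    return [angle_at(a, b, c), angle_at(b, c, a), angle_at(c, a, b)]

# A deliberately sliver-shaped face - the kind of element Abaqus would flag.
face = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (5.0, 0.5, 0.0)]
angles = corner_angles(*face)
outside = [round(a, 1) for a in angles if not 45.0 <= a <= 135.0]
print("Angles:", [round(a, 1) for a in angles], "- outside 45-135 range:", outside)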


Checking for Modeling and Syntax Errors


While the printed information previously discussed can be helpful when debugging a model, the most valuable information in the .dat file relates to modeling and syntax errors. During the input processing phase, Abaqus checks all keywords and makes sure that no information is missing that would prevent the analysis from solving. If any required information is missing, Abaqus aborts the analysis and provides a description of the underlying reason. For example, forgetting to assign a material definition to a property card, or defining contact pairs in which slave nodes have multiple masters, will be detected and identified within the .dat file.


e.g. ***ERROR: n elements have missing property definitions


and believe me… we’ve all forgotten to define a material or section property at one time or another.
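

Because every such message is tagged with the same "***ERROR" marker shown above, it is easy to pull them all out of a large .dat file programmatically. A minimal sketch (with a hypothetical file name) might look like this:

from pathlib import Path

dat_text = Path("beam_test.dat").read_text(errors="ignore")

# The "***ERROR" prefix matches the messages quoted in this post.
errors = [line.strip() for line in dat_text.splitlines()
          if line.lstrip().startswith("***ERROR")]

if errors:
    print(f"{len(errors)} error(s) found:")
    for msg in errors:
        print(" ", msg)
else:
    print("No ***ERROR lines found in the .dat file.")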


Although missing information is the most common reason for an analysis to abort during the Pre.exe phase, syntax errors can also crop up when manually editing the input deck and will prevent an analysis from starting. For example, misspelling a keyword or entering too many arguments will cause the job to abort. We tried to run a model with the card ‘*STEPP’ in it and the following errors were generated:


***ERROR: Unknown keyword "STEPP". The keyword may be misspelled, obsolete, or invalid.

***ERROR: in keyword *STATIC, file "TEST.inp", line 1489022: The keyword is misplaced. It can be suboption for the following keyword(s)/level(s): step


In this case, the .dat file is letting us know that Abaqus doesn’t recognize the misspelled card, *STEPP. It then also complains that the *STATIC card is misplaced, since *STATIC must be a sub-option of the (missing) *STEP definition. Because *STATIC is a valid keyword that is simply misplaced, Abaqus also identifies the line in the input deck where the error occurs (line 1,489,022).
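

When the input deck runs to over a million lines, jumping straight to the reported line is quicker than scrolling. A minimal sketch (using the file name and line number from the example above) to print a few lines of context:

from pathlib import Path

inp_file = Path("TEST.inp")
error_line = 1489022  # line number quoted in the *STATIC error message
context = 3           # lines of context on either side

# Stream the file line by line so even very large input decks stay cheap to scan.
with inp_file.open(errors="ignore") as fh:
    for number, text in enumerate(fh, start=1):
        if abs(number - error_line) <= context:
            marker = ">>" if number == error_line else "  "
            print(f"{marker} {number}: {text.rstrip()}")
        elif number > error_line + context:
            break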


Summary of the Analysis to be Performed

 
 

Lastly, the .dat file provides some useful information regarding the total size of the problem being solved, reported in terms of the nodes, elements and variables in the finite element model. Abaqus also estimates the minimum and optimal amounts of memory (RAM) required to complete the analysis. This is incredibly useful when running large jobs, as it allows the user to select appropriate hardware to achieve the most efficient solve time. If the “Memory To Minimize I/O” estimate is greater than the available system RAM, then some information must be swapped back and forth between RAM and disk, which results in a substantial increase in solve time.
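

If you are scripting job submissions, it can be handy to pull these estimates out automatically and compare them against the machine you plan to run on. A minimal sketch (hypothetical file name; the keyword match may need tuning for your Abaqus version):

from pathlib import Path

dat_lines = Path("beam_test.dat").read_text(errors="ignore").splitlines()

# Print every line of the size/memory summary that mentions memory, including
# the "Memory To Minimize I/O" figure discussed above.
for line in dat_lines:
    if "MEMORY" in line.upper():
        print(line.strip())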


Final Thoughts


The information printed to the .dat (or data) file can be very useful when debugging or post-processing. As we’ve discussed in this post, it can be used to identify syntax and modeling errors that prevent an Abaqus solve from beginning as well as to understand the checks and adjustments that are performed prior to the analysis execution.


As with the status file, the information in the data file is not only useful when an analysis fails. Ensuring element quality in the key regions of your model can be the difference between meaningful results and misleading calculations, and understanding memory usage can help you optimize your simulation hardware strategy.


Whether you’re an experienced Abaqus user or a complete beginner, Fidelis can help you get the most out of the software with bespoke support and fully integrated simulation solutions.
