Use of Computers in Forensic Engineering
N. Krishnamurthy, Ph.D. Structures and Safety Consultant, Singapore
1. INTRODUCTION
Computers have been with engineers for the last four decades, so much so that we now depend on them more than we depend on ourselves! Naturally, when it comes to investigating accidents and finding their primary and secondary causes, computers have a very important role to play. The 'Sherlock Holmes' lens of today is the computer, through which complex mysteries can be resolved and details which were hidden may be brought to light.
Computers are extensively used for data management, statistical analysis, structural, geo-technical and other evaluations, and for computer slide or video presentations in court and elsewhere. Needless to say, a forensic expert who wishes to use the computer in his work had better be quite thorough with the concepts, principles, and details of computer applications, as well as with on-the-spot manipulation of computer calculations and images. A forensic engineering expert looks weak if he has to call upon a computer expert in court!
[NOTE: Use of male pronoun or other reference would automatically cover the female equivalent except when either one is implied by context. - NK]
Lawyers on each side will have computer experts to debate with the testifying expert from the other side. Judges also take the trouble to study up for the case, and may ask very probing questions which may stump an expert more than the complex analysis itself. "How does the pixel resolution affect the accuracy of your results?" was a question I was asked by a judge.
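The judge's question can in fact be answered quantitatively. As a minimal sketch (in Python, with hypothetical numbers - a 2.4 m doorway spanning 300 pixels is an assumption for illustration, not from any actual case), the scale of an image fixes the measurement uncertainty per pixel:

```python
def pixel_uncertainty(real_span, pixel_span, blur_px=1.0):
    """Scale (length units per pixel) and measurement uncertainty
    for lengths measured off an image: an edge located to within
    +/- blur_px pixels is uncertain by +/- blur_px * scale."""
    scale = real_span / pixel_span
    return scale, blur_px * scale

# Hypothetical: a 2.4 m doorway spans 300 pixels in a CCTV frame.
scale, err = pixel_uncertainty(2.4, 300)
# scale = 0.008 m per pixel, so each pixel of edge blur adds about
# +/- 8 mm of uncertainty to any length measured in this image.
```

The finer the resolution, the smaller the scale, and the tighter the bounds one can honestly claim for dimensions measured from a photograph or video frame.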
In my fifty-five years of active service, more than fifty of them have been spent with computers, starting with second generation mainframes right up to today's powerful desktop PCs, and from BASIC and FORTRAN to 'C' language (after which I stopped learning new languages), and using numerous sophisticated analysis packages.
During all that time, I had the opportunity to use computers to:
(a) Develop software for my own research and as a consultant to private companies in USA, Singapore, and India, and to the Governments of USA and Singapore;
(b) Publish numerous papers and lecture widely on computer applications, write a book on computer graphics, and contribute a section on finite elements to a book on implant dentistry; and,
(c) Consult for accident investigators (the term 'forensic' came later) and expert witnesses in USA with computer analysis of structural failures and accidents, and serve as forensic engineer and expert witness in Singapore.
Now the computer is my strongest ally in my forensic work. I would not have been able to get this far in forensic engineering, or contribute this much to it, without the computer.
2. DATA MANAGEMENT
In our cyber world today, the computer is the storehouse and repository of all data on people, materials, equipment, processes, correspondence, financial matters, and everything else that is remotely connected with all our thoughts and deeds.
When an accident happens, the first act of the investigator should be to collect and secure any and all data available on the event, and on the people and artefacts connected with it at the site. Time is of the essence. Computerisation of data is the most efficient means of securing it against loss, corruption, or manipulation. Computers are today the most effective way of documenting the chain of custody/evidence to authenticate the expert's testimony.
2.1. Source of data
Data capture is fortunately easy and fast these days. The digital camera has revolutionised the visual medium. Almost every office and shop, most homes and condos of the rich (and in places like Singapore even government-built flats), many localities with unsavoury reputations, and other sensitive areas are watched by CCTV cameras. We get one chance at data capture, and we must grab all we can. We never know what will appear in which picture that may become the turning point in an investigation.
Normally the best source of data is the accident site itself. However, as I have explained in another paper, [1] if you are called in to investigate an accident after the site has been cleaned up, a site visit may not be practically useful. In such a case, official testimony and authenticated stored data are your only resource, and then the computer is your last hope.
The beauty of computerised data management is its reproducibility and accuracy, almost unaffected by who does it, so that there can be no question of bias or alteration, and once thoroughly checked during the first round, no question of human error. It makes for a good evidence trail.
In my own forensic preparation, I use the computer for all data management, from storage and search to data mining on the web.
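As one hedged illustration of such an evidence trail (a sketch only, not a court-accredited procedure; the file name is hypothetical), a cryptographic fingerprint recorded at collection time lets anyone later verify that a file has not been altered:

```python
import hashlib

def fingerprint(path):
    """Return the SHA-256 hex digest of a file, read in chunks so
    that even large CCTV videos need not be loaded whole."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# At collection time, record a manifest of digests (name is hypothetical):
# manifest = {"site_photo_001.jpg": fingerprint("site_photo_001.jpg")}
# Later, re-compute and compare: any mismatch flags possible alteration.
```

Re-computing the digests at any later stage and comparing them against the recorded manifest exposes any alteration, however small.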
2.2. Data and Testimony
Computerisation of data is also useful for further analysis, for transfer of information and findings to other parties in the loop, and for presentation during testimony, or for validity check of the other side's testimony.
In one exchange with a witness involving the possible failure of a temporary structure due to overloading, I had a slide of the construction site projected on the screen and pointed out that contrary to defence claims, there was no ‘designated area’ marked out for the heavy load to be deposited by crane, as required by design regulations. It had also not been marked on drawings.
In another court appearance, I had occasion to provide technical commentary on the various stages of how the victim fell, by moving the CCTV video of the accident frame by frame.
2.3. Nature of computer evidence
Computers are as necessary to modern forensic work as they are suspect in their integrity. As most people know, with a computer, any desired text or numerical output can be fabricated, any existing image can be altered, and any new desired image can be created. Because computer processing is invisible to the human eye, courts are very strict in accepting evidence based on computer-created text and images. The only two ways out of this problem are:
(1) Pre-certification by the source and chain of custody of data, as with Government documents; and,
(2) Post-authentication by highly specialised and accredited computer labs, based on meta-data and very, very tedious and expensive tests.
The same scepticism may also apply to impressive results based on computer analysis. As it is practically impossible to directly check the accuracy of computer output in, say, a finite element analysis, the only way is to independently run the analysis on the same or equivalent software with the original data, and compare the results.
If the two results differ drastically, it gives the opposition an opening to pick a hole in the input. It also gives them a chance to argue that the submitted results are 'mis-represented' (meaning manipulated) and to demand an explanation for the difference.
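Such a comparison can be made systematic. The sketch below (illustrative only; the 2% tolerance and the stress values are assumptions, not from any actual case) flags output quantities from two independent runs that differ by more than a chosen relative tolerance:

```python
import math

def compare_runs(results_a, results_b, rel_tol=0.02):
    """Compare two lists of corresponding output values (e.g. peak
    stresses) from independent analysis runs; return the indices
    where they differ by more than rel_tol (2% by default)."""
    return [i for i, (a, b) in enumerate(zip(results_a, results_b))
            if not math.isclose(a, b, rel_tol=rel_tol)]

# Illustrative values only: stresses (MPa) from two independent runs.
original = [152.0, 98.5, 210.3]
check    = [151.1, 98.9, 245.0]
print(compare_runs(original, check))  # flags index 2 as divergent
```

In this example the third value differs by about 14% and is flagged, while the first two agree within the tolerance; it is exactly such flagged quantities that the expert must be prepared to explain in court.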
3. COURT PRESENTATIONS
Today's courts are a far cry from courts of a few decades ago, although decades-old practices continue in some nations which do not have modern infrastructure. Almost everything in courts is now computerised, from record keeping and scheduling, to facilitating visual graphic presentations, both still and video.
Visuals are unavoidable in forensic engineering presentations, to enable the judge, lawyers, witnesses, the media, and other key spectators to see and understand the evidence clearly. Earlier, witnesses would display visuals through flip charts, overhead transparencies, and/or huge photo enlargements. Then 35mm slides projected on a screen served well for many decades. The current popular mode is LCD projection from computers, most commonly with slides presented through digital tools such as MS-PowerPoint.
Small scale models of accident sites and artefacts (or conversely large-scale models of very small items) were commonly used to give all parties a clear idea of relative locations, sizes, and functioning of different components. In traffic accidents, models of vehicles involved and the street and intersection layout were routine. Mannequins were used to point out body parts and postures if human beings were involved.
These strategies are still used in court to give a 3D view of things and actions. In a case in which I testified on the failure of a rebar grid, my lawyers had a model made of the cage to point out various parts and locations.
More and more, rather than large size prints or physical models, computer renderings of models or of the objects themselves are accepted for testimony, with mutually agreed upon stipulations on their admissibility for presentation, and for further proof of authenticity and representation as occasion demands. Virtual worlds are as common in court as in movies!
Computers do speed up presentations, and make the proceedings much more interesting and meaningful than otherwise.
4. MATHEMATICAL ANALYSIS
Current use of mathematics (in my area of speciality, structural engineering) ranges from basic analysis by classical principles (even hand-worked with the aid of calculators) to highly refined matrix and finite element analysis, vibration and earthquake analysis, etc. by computers.
Validation of computer analysis is critical in forensic engineering because the investigator cannot simply do his favourite thing with the computer and expect it to be accepted in court as gospel. He must anticipate arguments from the other side, try various computer models and be able to justify his particular choices, and recommend one finding over others with credibility.
4.1. Pitfalls in computer packages
Along with the advantages of computers come certain limitations. Any computer package is a product of one or more minds, its power and validity depending on the sum total of the experience and expertise of those minds.
Most computer packages today are so massive and complex that a user cannot look into their programme guts, and even if he could, he would not understand much. At the same time, these are the very reasons the user should clearly understand the following:
- Particular principles or theories which have been adopted in the software;
- Assumptions on which the programme has been written;
- Criteria and limitations to various modelling options in the input stream;
- Transformations and approximations used to get and present the results; and,
- Changes from one version of the package to the next.
Packages offer 'models' and idealisation options for the user to choose from, and that is where the novice or over-confident user can go wrong. The craze for the latest version of a popular package overlooks the truth that the user might not have exploited even a fraction of the capabilities of the previous version! Where human judgement is necessary, automation may not always give the right option under the circumstances.
The final big hurdle in computer software use is that the numerical output is so voluminous that nobody can wade through it all and extract the most critical values needed to decide between alternative designs or to proceed to further analysis. So we have to depend on the summaries and graphical representations as decided by the programmers.
In other words, unless one is careful, everything except the input of the bare essentials is out of the hands of the user of the package!
4.2. Matrix and Finite Element Analysis (MFEA)
MFEA probably leads computer use in forensic civil engineering investigations. These days, most finite element analyses (FEA) are massive jobs, running to millions of equations, with powerful software and hardware to match. Because of their ability to probe deep into the behaviour of solids, the expense involved may be worthwhile in major accidents.
The main advantages of MFEA are:
- Today’s desktop computers can handle huge problems within a short time.
- Interactive graphics will not only check the validity of your input but can actually facilitate it through helpful graphics and text entry wizards.
- Data validation and error correction guidance are generally built into most software.
- Results are displayed graphically so that you can confirm the reasonableness of the answer immediately.
- With a few keystrokes you can change essential parameters of the computer model, and see their effect on the results right away.
Subject to the caveats mentioned earlier, computer results are the best legal ('forensic') evidence, to the extent that anything the expert has done should be reproducible and can (and often will) be questioned and checked by anybody in the loop, particularly the other side.
5. COMPUTER ANALYSIS IN FORENSIC ENGINEERING
Apart from the modelling of the structure itself, questions will arise on the exact site conditions: material properties, field connections, supports and loadings. That is where deviations from design and intended use may occur. The analyst would have to try various combinations of parameters before the results give indications of possible causes. A couple of recent examples from the public domain, and one case study from personal experience, will be reviewed.
5.1. Collapse of steel bridge on I-35W at Minneapolis, USA, 2007
About 6:05 p.m. on Wednesday, August 1, 2007, the eight-lane, I-35W highway bridge over the Mississippi River in Minneapolis, Minnesota, USA, failed catastrophically. The central 1,064 ft. (324m) long deck truss portion of the bridge collapsed, together with adjacent sections of approach spans. Thirteen people died, and 145 were injured. Figure 1 displays the collapsed bridge with the inset showing a view before collapse.
Gusset plates were suspected to be the failed element; Figure 2 top right depicts one of them. To confirm and quantify their role in the collapse, consultants Simulia, using the ABAQUS package, conducted an elaborate finite element analysis (FEA) [2] of the entire bridge and of two of the failed gusset plates marked in Fig. 2, top. The other three parts of the figure show the actual failed gusset plate for the upper joint recovered from the river, the 3D finite element mesh with 2.7 million degrees of freedom (so fine we cannot see the mesh!) and the von Mises stresses clearly marking the failed zones.
Fig. 1. I-35W bridge, before and after
Fig.2. FEA of bridge collapse on I35-W at Minneapolis, USA
5.2. Fire effects on WTC beam and slab floor, 2001
Fig. 3. Fire effect FEA

Apart from stresses and deformations, FEA can also determine effects of temperature, vibration, impact, creep etc. Figure 3 shows some results of the forensic analysis of fire effects on World Trade Center - 5, part of the attack on 11 September 2001, [3]. This building was not directly hit by terrorists, but raging flames in the Twin Towers spread to WTC-5 and caused localised collapse of the top five floors.
The FEA solution by ABAQUS package was a heat-transfer thermal-stress analysis and its results agreed quite well with the observed fire damage. The structural model included spray-applied fire insulation and concrete slab on steel framing, as in Fig. 3, top left, for a shear connection. The top right figure shows the temperature distribution near the connection after two hours of the fire exposure.
Bottom left part of Fig. 3 shows von Mises stress distribution and deformations after two hours, and bottom right part shows comparison of FEA and actual forensic evidence of the torn plate, confirming bearing failure at the bolts. The analysis also confirmed that the insulation delayed heat transmission to the steel, and the concrete slab served as a heat sink as anticipated.
5.3. Collapse of rebar cage
A major finite element application to forensic civil engineering came my way when I was invited to investigate the collapse of a rebar cage for 3m and 5m reinforced concrete slabs. I have touched on some aspects of it in another paper, [1] which may be referred to for background. I did scores of MFEA, first to check the analysis and design submitted by the designers and contractors, and next to try various possible support and loading scenarios to explore the failure.
Fig. 4. Configurations analysed by FEM
Figure 4 displays many of the support system configurations I analysed, using STAAD-PRO. Although exploiting biaxial symmetry in the detailed analyses was a compromise to avoid unduly long runs, a few coarse-mesh runs were made on the full domain for unsymmetric loadings, to get an idea of how big the differences would be.
Figure 5 depicts one of the larger computer models of the failure zone. This was just one quadrant, so that the actual structure would have had four times as many joints and members.
Finite element analysis of any practical scope is not a one-person job. The lead investigator has to conceptualise and lay out the model and its 2D or 3D configurations, material properties, support conditions, loadings etc. But the preparation of input is quite time consuming and highly software dependent, and better done by a person trained in that particular package. The investigator should of course know enough about the package to ensure that the data developer is on the right track, ask intelligent questions and offer sensible suggestions. Accordingly, I used the services of an experienced assistant to prepare the input, carry out the runs, and document the output, all under my supervision and instructions.
Fig. 5. Computer Model of Quadrant
A key answer I sought was the factor of safety against failure under worst loading conditions, and it was the computer that enabled me to determine the value under various scenarios within a reasonable time frame. Although further root cause analysis could not be made purely on scientific grounds or factual basis, I was able to present credible scenarios of failure and demonstrate many weaknesses in the original design. As an expert witness, I had done my job and given my testimony, thanks to the computer.
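The scenario sweep described above can be sketched as follows (all capacities and demands are hypothetical illustrations, not values from the actual case): for each support/loading scenario, the factor of safety is the computed capacity divided by the worst-case demand, and the governing scenario is the one with the lowest F.S.:

```python
def factor_of_safety(capacity, demand):
    """F.S. = capacity / worst-case demand for one scenario."""
    return capacity / demand

# Hypothetical scenarios: (name, member capacity kN, worst demand kN)
scenarios = [
    ("design supports, design load", 480.0, 200.0),
    ("one support displaced",        480.0, 305.0),
    ("eccentric crane load",         480.0, 260.0),
]

results = {name: factor_of_safety(c, d) for name, c, d in scenarios}
governing = min(results, key=results.get)
# The governing scenario is the one with the lowest factor of safety,
# to be compared against the required minimum (2.0 for formwork).
```

In this illustration the displaced-support scenario governs, with an F.S. of about 1.57 - below a 2.0 minimum - which is exactly the kind of finding such a sweep is meant to expose.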
Incidentally, the accumulated computer-generated information had a side benefit: Forensic engineering often highlights weaknesses in design or construction procedures, and facilitates their rectification to help future users.
In this case, the defendants opined that an F.S. of 1.5 would be enough, just as for permanent structures; but based on my findings, I insisted that we needed a minimum of 2.0 - which had also been recommended earlier by the Committee of Inquiry for the Nicoll Highway collapse, [1]. At the time of both collapses, the low load factors had been admissible, and so my opinion was just another recommendation and not grounds for a charge.
In due course, when as part of the Workgroup of the national Formwork Code Committee, I was assigned the responsibility to develop the design section, I took the opportunity to steer the move to set minimum safety factor for formwork at 2.0, for any method of design or testing.
5.4. 'Playing around' with computers
Forensics is essentially detective work, going backwards in time, trying to find the cause of some mishap that has already happened. To succeed, you need a tool that can help you try out various scenarios and check which one fits the known facts best, and how well.
For this guessing exercise, the computer is the ideal tool. Before the advent of computers this used to be a mind-game, a paper task, or a lab experiment, all complex and/or time consuming. The computer eliminates both deficiencies. It is precise, fast, and relatively inexpensive.
Because of the ease and speed of computer analysis, whenever I investigate a new problem which requires computer analysis, I use the model I have created to not only answer the questions posed but also to ‘play around’ with it trying various other scenarios, as much to understand my solution better, as to learn anything more that can be gleaned from it.
This approach has given me some valuable insights. If I had not found out some unexpected problems during my playing around, an accident or failure might have happened sooner rather than later - should I then call such playing around, 'preventive or pro-active forensics'?
Laying steel sewer next to R.C. tunnel:
One such episode occurred when, while at Auburn University (1967-1975) I was consulted on the safety of a (then) 60-year old reinforced concrete sewer pipe during the laying of a new adjacent steel culvert, in the neighbouring city of Columbus, Georgia. (Figure 6, a, b, c.)
I chose FEA for it - a daring move at the time - and showed that when the tractor crane hauling the steel pipe rode over the buried concrete pipe the old sewer might just make it through. I recommended placing spreader steel plates under the track and driving slowly.
Over the phone, the client asked, 'What if the sheet pile tie-backs slacked off during use?'
No problem; I input a top displacement slack of half an inch (the specified tolerance) into the data and came out with the result that it would decrease the maximum stress by about 5%. That ended my assignment. They were going to launch the project the next morning.
Fig. 6 Playing around with computers on R.C. sewer
But overnight, the thought struck me that the tieback might just as well have been over-tightened during erection and adjustment, rather than slackened.
So I sent word to the client and asked them to hold off the culvert laying by a day. Next morning, I rushed to the computer centre and waited for it to open.
The re-analysis for half-inch over-tightening gave me 10% overstress, which would almost surely have destroyed the old sewer. I called the Columbus city fathers and advised them to make sure that there was no over-tightening.
Also, I had second thoughts on what the tractor load would do to the soil under these fluctuating tie-back loads. So I recommended that they avoid loading the old sewer at all, and reach the excavation from the other side. It meant some additional time and expense, but they appreciated the forewarning - a failure would have meant court cases and hundreds of thousands of dollars (today millions!) worth of claim settlements.
6. FAILURES DUE TO SOFTWARE
6.1. Sources of computer-related failure
We will not be discussing the misuse of the computer itself, such as spam, virus, hacking, identity theft, or other aspects of cyber crime, which is a separate field in itself. Computer hardware error also is very, very rare; the Pentium FDIV bug of 1994, [4], giving error in the fifth significant digit during division, is the only one in recent memory.
But there have been many errors or limitations in computer software which went unnoticed for long and were found to be the reason for many wrong analyses and bad designs. Recent history is full of accidents and failures due to misuse of computer programs.
The latest software error reported by Confidential Reporting on Structural Safety (CROSS), [5] reads thus: "... a current package for pad foundation design has no factor of safety against overturning, returning a 'pass' for unfactored loads and a utilisation ratio of 1. Furthermore it has been noted that the same program returns a 'pass' without checking the bending capacity of the base in hogging. ... The suppliers ... have undertaken to correct them at some point but there may be many pad foundations in use which have been incorrectly designed using this program."
The Editorial comments on it thus: "... there is concern about reliance on computer output and there should always be a check to ensure that results are sensible. In simple cases such as a pad foundation this can be an approximate manual calculation."
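Such an approximate manual check for overturning might look like this (a sketch with hypothetical dimensions and loads; the minimum F.S. of 1.5 against overturning is a commonly quoted traditional value, not a claim about any particular code or package):

```python
def overturning_check(restoring_moment, overturning_moment, min_fs=1.5):
    """Return (F.S., pass/fail) for overturning of a pad foundation.
    F.S. = stabilising moment / overturning moment about the toe."""
    fs = restoring_moment / overturning_moment
    return fs, fs >= min_fs

# Hypothetical pad: 2 m wide, total vertical load W = 300 kN acting at
# the pad centre, lateral load H = 40 kN applied 1.5 m above the base.
W, B, H, h = 300.0, 2.0, 40.0, 1.5
fs, ok = overturning_check(W * B / 2, H * h)   # 300 kN.m vs 60 kN.m
```

Five minutes with such a hand-level calculation would expose a package that reports a 'pass' without ever checking overturning at all.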
In the USA of the 1960s - the days of 'Lone Ranger' programmers - I had the 'luck' to catch a bug in a matrix analysis programme I was given during a short course, and a data input glitch in SAFE-3D, the first 3D finite element analysis programme. Needless to say, I too have suffered the pangs of bugs others found in my software, luckily before they caused any damage!
Software errors are also becoming less frequent these days as reputable software houses have very stringent check and test procedures. In any case, to avoid liability problems, most software contracts carry a disclaimer clause which in effect says that the supplier will not be responsible for any damage or loss caused by the use of the software. 'Caveat emptor' - "Let the buyer beware!"
Currently, therefore, most failures involving computers can be traced to:
(a) Incorrect application of correct computer programmes;
(b) Wrong data input; or,
(c) Wrong output interpretation,
all of which could be grouped under 'software misuse'.
In particular, MFEA was and continues to be an art as well as a science. While number crunching and, more recently, animated colour graphics are the computer's main contributions to this powerful tool, two areas are - or at least should be - still the domain of the human mind:
(i) Modelling of the problem; and,
(ii) Interpretation of the results.
Most MFEA-related failures happen because of deficiencies in these two areas.
6.2. The computer trap of over-reliance
The very speed and power of the computer are also its built-in danger. Computer users soon get sucked into its sway, and blinded by its glamour. The hardware and software gradually take over the user’s thinking, lulling him into a false sense of security and trust - until disaster hits.
Computer results can never substitute for understanding structural behaviour. The engineer should know the approximate ‘ball-park’ answer before going to the computer and must be able to distinguish an accurate solution from one that is absurd but appears precise, [6].
The algorithms of automatic input preparation and output display are the product of someone else’s brain. By now, almost everyone who uses MFEA has lost touch with what goes into input and what comes out as output. Many are using very powerful finite element packages for very complex structures and phenomena, without the least idea of the element being used, the latest theory embedded, the specific criteria applied, or the possible effect of modelling on the results.
'Wizards' - those smart routines which ask for just a few numbers and clicked choices, develop an entire mesh, run the analysis, and wrap it up with nice colour graphic results - are like some voodoo magic that gets what you want fast and effortlessly, no matter that you don't understand what, how, or why. Serious professionals would at least spot check their creations.
Until I gained enough know-how, I always tried to run every FEA problem with at least two and preferably three different meshes, not to eliminate the 'discretisation error' (because you really cannot eliminate it altogether), but to know the magnitude of potential error in the result, so that I could extrapolate to the true value for the continuum, as in Fig. 7.
In the late 1960s, this was standard practice for serious work. Without offering FEA error bounds and an estimate of the corrected value, I doubt if the U.S. Atomic Energy Commission would have accepted my recommendations, [7]! Even now, I do not trust my own single mesh answers, unless I have similar mesh experience or available literature has equivalent results.
Today, FEA tools have become smarter. The discretisation problem is mostly pushed out by the sheer brute force of very many degrees of freedom. Even so, without awareness of modelling effects, an FEA package can be like a knife in the hands of a child.
Fig. 7. Mesh effects and extrapolation to continuum
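The extrapolation in Fig. 7 can be expressed with Richardson's formula: if the error of a mesh of element size h shrinks like h^p (p = 2 is a common assumption for well-behaved elements, and must be justified for the element actually used), then results from meshes of sizes 2h and h estimate the continuum value as u = u_h + (u_h - u_2h)/(2^p - 1). A minimal sketch, with a synthetic convergence sequence for illustration:

```python
def richardson_extrapolate(u_2h, u_h, p=2):
    """Estimate the continuum value from results on two meshes of
    element sizes 2h and h, assuming error ~ C * h**p."""
    return u_h + (u_h - u_2h) / (2**p - 1)

# Synthetic check: a quantity converging as u(h) = 1.0 + 0.5 * h**2.
u_coarse = 1.0 + 0.5 * 0.2**2   # h = 0.2  -> 1.020
u_fine   = 1.0 + 0.5 * 0.1**2   # h = 0.1  -> 1.005
estimate = richardson_extrapolate(u_coarse, u_fine)   # -> 1.0 exactly
# (u_h - u_2h)/(2**p - 1) also serves as an error estimate for u_h.
```

The same difference term gives the error bound one can quote alongside the result, which is what made such two- and three-mesh runs defensible evidence.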
A finding in a paper, [8] by Bella and Liepins is worth quoting (better than just my say so!): "Today's engineering graduates are well-versed in the matrix structural analysis methods that form the basis for computer analysis, but they are weak in the classical hand methods that allow approximate checks of finite element methods and develop a feel for structural behaviour."
To be fair however, we should not condemn all computer use outright. Surely, many users have studied the theory of MFEA in college and maybe even done projects with it. But once they come out of college, life takes on a different hue, getting results out ‘yesterday’ is important, and so youthful curiosity and healthy scepticism soon take a back seat.
Further, stupid mistakes can be (and have been) made even with manual calculations and graphics. The only differences with computers are that:
- The tool was developed by someone other than yourself in a process which you did not share;
- Once inside the computer, too much happens too fast;
- The entire process is invisible; and,
- You cannot check back the output and locate what went wrong and where - you can only check your input.
That is why computer applications need extra care, as some examples will illustrate.
6.3. Hartford Civic Centre Arena Collapse, 1978
Main details of this case have been covered in another paper of mine, [1]. Here I will touch only on the problems created by improper use of software by the designers.
The arena roof was a three-dimensional truss, designed for the first time with a three-dimensional matrix truss analysis computer programme. Erected in 1973, it served without incident for five years. But on the night of 18 January 1978, it collapsed under a snow storm. Just by good fortune, there was no one in the arena at the time of collapse, [9], (Figure 8.)
While most of the blame went to the fabricators who shifted the welded connections at the roof by a few centimetres, it was also discovered that the designers too did not realise the limitations of their computer programme, and did not heed the implications of the large deflections the roof experienced during the erection stage, blindly trusting the computer design for the strength and stability of the roof structure after erection.
The small shift of the designed connection had reduced its strength to less than a tenth in one case, and less than a third in another. Investigation after failure brought out the following limitations of the new computer programme which contributed to the collapse:
- The buckling mode of failure, to which roof design was extremely susceptible, was not considered in that particular computer analysis and thus not provided for.
- Incorporated into the computer model were some fundamental assumptions about end conditions on certain frame members, which turned out to be grossly oversimplified.
- Connection details were difficult to incorporate in the computer model.
As a result of these factors combined, and the users' over-reliance on computer analysis with an imperfect model, the seriousness of the fabrication changes to the connections was not apparent.
The reactions to the collapse were swift and very far-reaching:
- It shook public confidence in space truss roofs, and even more so, in the new-fangled computer analysis of structures.
- President Gerald Ford ordered water load testing for a similar roof in Michigan.
- Engineers and architects tempered their reliance on computer models, which had encouraged cutting structures down to the bare minimum, leaving no redundancy or margin for error.
- It focussed on the need to examine what exactly a computer program did, and how a structure needed to be modelled for computer analysis to reflect the desired design, as well as to be re-analysed when the construction/erection conditions change.
- It highlighted the need to watch out for warning signs such as deflections or deformations during the construction stage much higher than predicted in the design.
6.4. Sleipner Off-Shore Platform Collapse, 1991
Decades after Hartford, computer confusion still remained with us! The Sleipner platform, built to produce oil and gas in the North Sea and standing over 82m of water on a concrete gravity base structure, sprang a leak in one of the supporting cells on 24 August 1991 during erection, sinking the whole structure and causing an economic loss of $700 million, [10], (Figure 9.)
Why? After all, they had used one of the most popular finite element packages in the world at the time, NASTRAN! Yes, but the problem with finite elements is not how good a package you have, but how good you are with any package you have!
Post-accident investigation traced the error to inaccurate finite element approximation of the linear elastic model of the supporting cells - simply put, the mesh was too coarse. The shear stresses were underestimated by 47%, leading to insufficient design. More careful FEA made after the accident, predicted that failure would occur with this design at a depth of 62m, which matched well with the actual occurrence at 65m.
Fig. 8. Hartford Civic Centre collapse
Fig. 9. Sleipner Off-shore Platform, before collapse, and schematic.
How fine a finite element mesh must be cannot be decided by the 'wizards' that accompany a modern package, supposedly to make your job easier but actually to take away your initiative.
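The effect described above - a coarse mesh underestimating peak stress - can be sketched in a few lines. This is an illustrative one-dimensional model (a bar fixed at one end under uniform axial load, linear elements), chosen by me and having nothing to do with the Sleipner analysis itself: the exact peak internal force at the support is q×L, but the constant-strain elements report only the average over the first element, so the peak is under-predicted until the mesh is refined.

```python
# Illustrative sketch: a bar fixed at x=0 under uniform axial load q,
# meshed with n linear elements (EA taken as 1). The peak internal force
# at the support is underestimated, and the error only shrinks as the
# mesh is refined - the same trap, in miniature, as an over-coarse mesh.

def peak_force_fe(n, L=10.0, q=1.0):
    """Peak internal force from an n-element mesh (tridiagonal solve)."""
    h = L / n
    k = 1.0 / h                          # element stiffness, EA = 1
    # Stiffness and consistent loads for free nodes 1..n (node 0 fixed)
    main = [2 * k] * (n - 1) + [k]       # diagonal
    off = [-k] * (n - 1)                 # off-diagonal
    f = [q * h] * (n - 1) + [q * h / 2]  # consistent nodal loads
    # Thomas algorithm for the tridiagonal system
    for i in range(1, n):
        m = off[i - 1] / main[i - 1]
        main[i] -= m * off[i - 1]
        f[i] -= m * f[i - 1]
    u = [0.0] * n
    u[-1] = f[-1] / main[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (f[i] - off[i] * u[i + 1]) / main[i]
    return u[0] / h                      # axial force in the first element

exact = 10.0                             # q * L at the support
for n in (2, 4, 8, 16):
    fe = peak_force_fe(n)
    print(n, round(fe, 3), f"{100 * (exact - fe) / exact:.1f}% low")
```

Halving the element size halves the error here; a convergence check of this kind, rather than a package's default mesh, is what tells you whether the mesh is fine enough.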
6.4. How bad can the variations be?
The 113-page paper [11] by Professor Emkin, Founder and Co-Director of the Computer Aided Structural Engineering ('CASE') Center at Georgia Tech, is very revealing. Figure 10 shows the 67-storey reinforced concrete building analysed for forces, moments and deflections by the following five models, with results plotted along certain lines:
Fig. 10. Emkin's RC Building analysis of computer solution accuracy
- FEA - Full Finite Element Model, with FE Floor Slabs
- RBPCD - Rigid Body Plane floor membrane, including Column axial Deformations
- RBPNCD - Rigid Body Plane floor membrane with No Column axial Deformations
- RBSCD - Rigid Body Solid floor including Column axial Deformations
- RBSNCD - Rigid Body Solid Floor with No Column axial Deformations
The exact values in the tiny pictures may not be readable. But the wild and wide horizontal swings of the plotted quantities in the ten charts from the five methods should be convincing enough about how the choice of computer model can govern the results of a computer FEA.
I have highlighted (more basic) variations in computer modelling in another paper, [12].
6.5. Case Study from Geo-technical Engineering
Geo-technical engineering is particularly susceptible to computer modelling errors because soils have so many variable properties that the user needs expertise and extra care with the input parameters and the failure model chosen. The 2004 Nicoll Highway Collapse in Singapore is a prime example, (Figure 11.)
Fig. 11. Nicoll Highway wall collapse, Left - The scene, Right - PLAXIS Results
This accident has been discussed in some detail in another paper of mine, [1]. The computer problem identified here is the wrong choice of geo-technical modelling option in the PLAXIS software package used. The designers had chosen Method A, while the prevailing soil conditions required the use of Method B, which would have predicted much greater wall displacements and moments than Method A, corresponding to the values measured at site.
Figure 11, right, shows the relevant charts from the report of the Committee of Inquiry, [13]. Again, all we need to compare is the spread of the curves in the top (Method A) and the bottom (Method B) charts for displacements and bending moments to see the shocking truth that the choice of the wrong model under-predicted the effects by about 50%, leading to the major tragedy.
The fact that daily measured displacements were consistently and considerably more than those predicted by the design should have clued the personnel responsible to re-examine the model, but it did not.
7. SPREADSHEETS
7.1. Advantages of spreadsheets
Computer spreadsheets such as Microsoft Excel are a boon to forensic investigation, because:
- They can handle vast amounts of data - 1,048,576 rows by 16,384 columns in Excel.
- They can exploit most of the advantages of computer data management listed earlier.
- They are fast and interactive, and ready in an instant.
- They will accept all kinds of data and can be easily programmed to arrange, sort and manipulate them, compute required answers, and display results in various tabular and graphical forms.
- The data validation aids available can help avoid entry of wrong data, and highlight inconsistent items in lists.
- They will let us change any data at any stage of analysis and immediately modify all affected quantities, as well as alter the related graphics to reflect the change made.
- They will do basic statistical analysis on data and results and display related graphics.
Spreadsheets suit the technique of experimenting and learning by playing around. I have had a lot of fun with them, and am using them more and more to satisfy my curiosity as well as to carry out my tasks.
Spreadsheets can also be risky, because they are so easy to use. The biggest danger is that, unless there is rigorous documentation and/or the cells which embed formulas or conditional formatting are locked, an accidental entry into one of these cells can wipe out the entire functionality.
Worse yet, the programme may continue to work, accepting wrong data into the loop due to the accidental entry, but eroding the integrity of the results without obvious signs that something is wrong! The user must keep checking back with standard results occasionally to confirm that no corruption of the programming has accidentally taken place.
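That habit of checking against standard results can even be automated. The sketch below is illustrative only - the bending-stress formula and the benchmark numbers are my own, not from any case: a benchmark whose answer is known is recomputed before each use, so an accidental change to the 'formula cell' is caught at once rather than silently eroding the results.

```python
# Hedged sketch of the "check against standard results" habit: re-run a
# benchmark case with a known answer before trusting the calculation.
# Formula and benchmark values here are illustrative assumptions.

def bending_stress(moment_knm, section_modulus_cm3):
    """sigma = M / Z, with M in kN.m and Z in cm^3, returned in MPa."""
    return moment_knm * 1e6 / (section_modulus_cm3 * 1e3)

# Benchmark: M = 50 kN.m on Z = 500 cm^3 must give exactly 100 MPa.
BENCHMARK = (50.0, 500.0, 100.0)

def self_check():
    m, z, expected = BENCHMARK
    got = bending_stress(m, z)
    if abs(got - expected) > 1e-6:
        raise RuntimeError(f"formula corrupted: expected {expected}, got {got}")

self_check()                            # run before every batch of real work
print(bending_stress(120.0, 800.0))     # 150.0 MPa
```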
7.2. Statistical Analysis
Spreadsheets like MS-Excel have a number of basic statistical functions such as Anova, Correlation, Descriptive statistics, Histogram, F-test and t-tests, and some limited regression analysis. Combined with graphical representations of data by bar-charts, pie-charts etc., I have found them quite adequate for my forensic engineering activities.
However, at a higher level, where the entire outcome of a case may depend on statistical findings and recommendations, some professionals may not be too happy with Excel's accuracy or scope, and then you would have to go to more sophisticated statistical packages like SPSS.
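For a sense of scale, the descriptive statistics and correlation mentioned above take only a few lines outside a spreadsheet as well. The sketch below uses Python's standard statistics module on made-up load and deflection readings - an illustration of the technique, not data from any case.

```python
# Descriptive statistics and Pearson correlation with the standard library.
# The load/deflection values are invented for illustration.
from statistics import mean, stdev

loads = [102, 98, 110, 95, 107, 101]          # illustrative loads (kN)
deflections = [5.1, 4.8, 5.6, 4.7, 5.4, 5.0]  # matching deflections (mm)

mx, my = mean(loads), mean(deflections)
sx, sy = stdev(loads), stdev(deflections)
# Pearson correlation, written out to stay within the standard library
r = sum((x - mx) * (y - my) for x, y in zip(loads, deflections)) \
    / ((len(loads) - 1) * sx * sy)

print(round(mx, 1), round(sx, 1))  # central value and spread of the loads
print(round(r, 3))                 # near +1: strong linear relationship
```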
7.3. Parametric Studies
Frequently it becomes necessary in forensic engineering to conduct parametric studies to find the effect of variations of different quantities affecting some critical outcome. Although this can be done with many general-purpose analysis packages, often one would have to port the results from them to graphics software for the plotting. Even packages which have built-in graphics work in batch cycles of analysis - results - reanalysis.
Spreadsheet software like MS-Excel can not only quickly generate computed results for multiple sets of variables, but can also simultaneously plot a variety of graphics from any required set. As and when the data change, the graphics also change in step, giving the forensic engineer a very powerful interactive decision-making tool.
In a recent investigation involving the fall of a worker from a mobile scaffold resulting in a head injury, the question was raised as to what the height of the guard-rail should be for a worker to lean on safely in the normal course of work, (Figure 12, left.) This analysis has also been mentioned in another paper, [1] and discussed in detail in my journal paper, [14].
I conducted a parametric study of the characteristics of workers of various girths falling over guard-rails of different heights, (Figure 12, right three.) The charts show, for instance, that with a guard-rail which happens to be 979mm high, a person leaning towards the rail at 10° and over the rail at 45° will fall over if he is 100mm in girth, just balance on the rail if he is 200mm in girth, and will be safe from falling if he is 300mm in girth, because his centroid will be outside, on, and inside the guard-rail respectively – so the fatter the better!
Fig. 12. Computer Analysis for Falling Over a Guard-rail
The point is, once I had set up the model, it was simple to incorporate the equations into the programme and come up with interactive graphics - quite thrilling to see the stick man move at my command and tell me whether he would fall or not! It turned out to be a good learning experience about (not) falling over a hand-rail.
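A grossly simplified version of the balance criterion can illustrate the idea behind such a model. The geometry below - the body height, the segment proportions, and the centroid sitting half a girth behind the leaning surface - is entirely my own illustrative assumption, not the model of [14]; it merely encodes the stated rule that the person falls when his centroid moves outside the rail line, and shows why a larger girth pulls the centroid back to safety.

```python
# Toy balance criterion only - NOT the kinematic model of [14]. A stick
# figure leans toward the rail (lower segment) and folds over it (upper
# segment); he falls if his centroid lies outside the rail line. The
# body height and "centroid half a girth behind the leaning surface"
# are illustrative assumptions.
from math import sin, cos, radians

def centroid_offset(rail_h, girth, body_h=1.70, lean=10.0, bend=45.0):
    """Horizontal centroid position relative to the rail, in metres.
    Positive = outside the rail (falls); negative = inside (safe)."""
    lower = rail_h / cos(radians(lean))        # segment below the rail
    upper = body_h - lower                     # segment folded over it
    x_low = -(lower / 2) * sin(radians(lean))  # leans in toward the rail
    x_up = (upper / 2) * sin(radians(bend))    # hangs out past the rail
    x_front = (lower * x_low + upper * x_up) / body_h
    return x_front - girth / 2                 # centroid assumed half a
                                               # girth behind the surface

for girth_mm in (100, 200, 300):
    x = centroid_offset(0.979, girth_mm / 1000)
    print(girth_mm, f"{x * 1000:+.0f} mm", "falls" if x > 0 else "safe")
```

With this toy geometry the balance point does not land exactly where the detailed model of [14] puts it, but the trend is the same: the slimmer figure tips over the 979mm rail while the stouter one stays safe.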
7.4. Presentation of data and results
The old adage that "one picture is worth a thousand words" can be exploited to full effect with computer graphics software, ranging from bundled software like MS-Paint to sophisticated stand-alone packages like Adobe Photoshop.
You can do wonders even with simple software, though the built-in capabilities of spreadsheets are limited to pictorial representation of data or results, and are not useful for the creation of new images.
Spreadsheets give the facility of producing bar-charts, pie-charts, and x-y plots simultaneously with the computations, in clear and colourful formats, (Figure 13.) These are a blessing when the forensic engineer wants to present the effects of some factor varying over a range, or the distribution of some critical value among various segments of a variable.
Fig. 13. Spreadsheet graphics
Statistics presented pictorially are not only prettier but also more powerful in their immediate impact on the audience. Comparisons are easier visually through charts than by numbers.
8. COMPUTER GRAPHICS FOR HIGHLIGHTING AND MEASURING
Although the computing capacity of computers is indeed what lifted mankind to a new level of professional sophistication, it is the GUI (Graphical User Interface) that really exploited the analytical capabilities of the machine.
Engineers from previous generations were lucky that they would have gone through a few manual drafting courses as students. Today, with plotters doing the drafting, sketching and drawing are lost arts.
In addition to my drawing courses, I even had some brief coaching in freehand sketching from an artist. But equally important has been my fascination with graphics as a skill and hobby, ending up with my writing a book on computer graphics, [15].
I soon found that the ability to sketch gave me a powerful advantage over those who did not have the facility. In my teaching, writing, and consulting on technical topics, I freely use graphics, especially 3D representations such as isometric and perspective, to great effect.
That computer graphics can be a useful tool in accident investigation and forensic activities may need no explanation. But the variety and versatility of its uses can be amazing.
8.1. Forensic proof
Photographs can be used to prove your contention or disprove the other side’s contention in a court. As already mentioned, digital pictures have to be authenticated before they can be accepted as evidence. Interestingly, old-fashioned negative-based prints (with original negatives) are still good for evidence because the molecules in the negative cannot be altered!
In one of my cases, a contractor claimed he had bound his rebars with wire at regular intervals according to regulations or standard practice. To disprove his contention, I showed a photograph of the area (Fig. 14), which clearly depicted the sparseness and irregularity of 'w' marks denoting the bindings.
The same photo also shows horizontal bars (B), links (L) and ties (S) with their centre lines marked, clearly documenting their sag, out-of-straightness, lack of verticality, etc. The 'w' and centre line marks draw the viewer's eyes to the essential elements. (I had to submit the unmarked originals also to the court.) While these by themselves could not prove that the lapses led to the collapse, at least they could cast doubt on the contractor's workmanship standards.
I have used comparisons of two photographs of wire bindings on rebars (shown by light coloured circles in Fig. 15) to argue that the near-perfect conditions of a lab test did not simulate the actual conditions at the accident site. Regardless of the scale of the two photographs (which at the time I meant to represent nearly the same extent) the orderliness and closeness of bindings in the lab specimen are in stark contrast to their randomness and sparseness in the site shot.
8.2. Forensic Computation
What do you do if you need the dimension of some item, but you have only a picture of it, and no other information?
Fig. 14. Graphical enhancement of photographic content
You can use the ratio of the particular item in relation to one or more items in the picture whose dimensions are known or can be estimated within some tolerance. I have used this technique more than once, but it takes solid science and a lot of effort to convince the court.
Fig. 15. Wire binding for rebars, (Left) at site, and (Right) in lab test
Although I have included this topic under computers, forensic engineers can do it (and have done it) on an enlarged print of the photograph also, by actually measuring on the photograph in millimetres and arriving at the same findings, as they used to do in the olden days.
It is just that it is much easier to measure with computers, whose images are made up of minute 'pixels' (picture elements), and also quite simple and fast to prepare visuals for the analysis and result presentation. I will share a couple of examples.
(a) Bent rod deflection:
In one accident a worker fell from a scaffold, and the helmet he wore (without a chin strap) had fallen first and been crushed by the falling scaffold. But the worker survived, although with a head injury. (The nearer helmet came from another worker.) I had to determine what saved the injured worker from the fatality which should otherwise have occurred according to the basic dynamics of falling, [14].
All I could find in the photograph was a bent rod forming part of the guard-rail the worker was leaning on. The worker's body might have hit the rod and bent it, absorbing much of the kinetic energy.
Figure 16 shows the top of the fallen scaffold with one helmet crushed under the far leg. The bent rod A’B, and the desired deflection are marked on it. The near end of the rod, tied by wire to the top rail (which itself was illegal), has slid down from its original position A to a stop at A’ on the vertical mid-rod.
To check out the impact force, I needed the rod deflection δ. But the rod was long gone from the scene, and God knew what happened to it. So I went to work on the photograph on my computer. These days you get most pictures in their soft copy version and you don’t have to scan them. But my lawyers had only hard prints, so I had to scan them into my computer first. Then I noted that δ spanned 39 pixels in the photo.
Fig. 16. Deflection of rod
To find how much δ was in mm, I needed a reference length. Generally, if and when I take a photo where I do not know one or more of the dimensions, I lay a 6-inch scale (which a forensic engineer is supposed to carry around with him) or at least a ball-point pen or some personal article of mine for reference. But this was not my photo.
However, I knew the plan dimensions of the scaffold as 1.8m by 1.2m. The photo perspective decreased the apparent dimensions as the object receded farther from the camera, as shown for the 1.2m side bar decreasing from 343 pixels to 198 pixels. The 1.2m at mid-length of AB could therefore be estimated as being represented by (343+198)/2 or about 271 pixels.
Then, deflection at mid-length could be calculated as: 1.2×39/271 = 0.17m or 170mm. I was then able to confirm that the energy required to bend the rod to this deflected shape had reduced the final impact to his body, in a way 'cushioning' his fall and preventing fatal injury.
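The arithmetic above amounts to a single scale conversion; a small helper makes the linear-perspective assumption explicit. The pixel counts are those read off the scanned photograph, and the reference is the known 1.2m scaffold side.

```python
# Pixel-to-length conversion using a known reference dimension whose
# apparent pixel size shrinks with distance; the scale at the measured
# span's depth is taken as the simple average of near and far (the
# linear-perspective assumption the text acknowledges is approximate).

def pixels_to_metres(span_px, ref_len_m, ref_near_px, ref_far_px):
    """Convert a pixel span to metres via an averaged reference scale."""
    ref_px = (ref_near_px + ref_far_px) / 2
    return ref_len_m * span_px / ref_px

delta = pixels_to_metres(39, 1.2, 343, 198)  # the 1.2 m scaffold side
print(f"{delta:.2f} m")                      # about 0.17 m
```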
How accurate is this deflection estimate? I have assumed linear behaviour for the perspective, which is not quite true: mid-span will not be midway between the ends in the photo - so I have not been too precise. But given time and incentive (like the need to prove more rigorously) we can get all these factors reflected in the calculation. However, for a first order estimation, my approach is adequate, and my guess is it would be correct to about 10%.
Will it stand up in court?
I knew the holes, apart from the accuracy problem (which is a common first line of criticism), that the other side could shoot in my argument. How did I know whether the rod was straight in the first place? I didn’t. Of course, if you ask the owner, he will swear it was straight, because otherwise it would be an admission of another deficiency on his part. In any case, my finding would still be a conjecture on my part, an ‘expert opinion’, to be argued endlessly as a stroke of genius by my lawyer, and as a crude flaw by the other side.
Actually, I did not have a chance to find out. The investigation never became a case, but was settled out of court - as happens in most cases that do not require mandatory prosecution.
(b) Underpass width:
Another time, I had to find the distance between two very critical lines on a surface in a fall accident, and give my expert opinion on how it compared with distances between similar lines in various countries. The Internet had the information only for a few of them. I had to rely on pictures captured from the web to make my argument. I resorted to computer graphics.
To demonstrate the procedure I adopted, let me take the example of finding the width u of a London underpass in Fig. 17(a) - nothing to do with my case, which I cannot talk about.
Fig. 17. Estimation of the underpass width from a photo
To find u from the photo, I can try to relate it to some other known dimension in the picture, as in the previous example. Let us say we know the height of the notice board is 1.1m. Then, I pass a cutting plane ABCD normal to the view line in Fig. 17(b), through the edge of the notice board, and note the pixel heights of the board and the underpass width as 195 and 587.
u = 1.1×587/195 = 3.31m, as accurate as the notice board dimension. No arguments here!
What if I did not have the height of the notice board, as could be the more common situation?
I have to use an estimate of some other familiar dimension, in this case, the heights of ‘average’ people in the picture. The Internet has lots of statistics on heights of people in various countries. For England, the BBC gives men’s and women’s average heights as 1.75m and 1.62m, [16].
Now I measure, in pixels, the heights of the nearest man and woman pair and the width of the underpass, in the normal cutting plane EFGH through the couple’s feet in Fig. 17(c), as 355, 337, and 715. Then u will be [(Man’s height/355) or (Woman’s height/337)] times 715.
From this, my estimate of u = (1.75/355)×715, or = (1.62/337)×715, i.e., 3.52m or 3.44m - quite close, giving an average of 3.48m, maybe rounded off to 3.5m. That is the best I can do for now.
My estimate from people's heights is in error by 100×(3.48−3.31)/[(3.48+3.31)/2], i.e. 5%.
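Both estimates reduce to the same single-plane ratio, and can be checked in a few lines; all the numbers are those quoted above.

```python
# Underpass width from pixel ratios: once from the known notice-board
# height, once from average people heights, then the spread between them.

def width_from_ref(ref_m, ref_px, width_px):
    """Scale a pixel width by a known reference in the same cutting plane."""
    return ref_m * width_px / ref_px

u_board = width_from_ref(1.1, 195, 587)       # known notice-board height
u_man = width_from_ref(1.75, 355, 715)        # average UK man, [16]
u_woman = width_from_ref(1.62, 337, 715)      # average UK woman, [16]
u_people = (u_man + u_woman) / 2

print(round(u_board, 2), round(u_people, 2))  # 3.31 m and 3.48 m
error_pct = 100 * (u_people - u_board) / ((u_people + u_board) / 2)
print(f"{error_pct:.0f}%")                    # about 5%
```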
Not bad! But then, first I was lucky to get a picture with 'average' Britishers. Then I was able to bracket the error only because I had the exact dimension of something in the same picture as the people. Without such help, all I can do is to minimise my potential error in various ways.
In such a situation involving human heights, an investigator will have to allow for the following variables. If he does not, the other side will tear his findings to pieces.
- The variation of heights from the average for the majority of people in a set may be ±5%, a range of 10%.
- The difference between average male and female heights runs between 8 and 10%. So, if you guess the gender wrongly (as is quite possible in these days of unisex dress and hairstyles, especially from the back) your results are off by another 10%.
- The person(s) in the picture may be tourists from another country with average heights much taller (Danish men/women, 184cm/171cm, about 5% more) or much shorter (Philippines men/women, 162cm/150cm, about 8% less) than the subject city’s average. This would introduce a further variation of up to 13%.
Add it all up, and we end up with a maximum variation of 33%, in the worst case scenario of the investigator mistaking the shortest lady from the Philippines for the tallest man from Denmark! It is really not so bad in practice.
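The worst-case figure quoted above is simply the sum of the three ranges, treated - as the text does - as additive percentages:

```python
# Worst-case stacking of the three height uncertainties listed above,
# treated as simple additive percentage ranges.
within_group = 10   # ±5% spread about a group's average height
gender_mixup = 10   # male vs female average difference
nationality = 13    # Danish (+5%) to Filipino (-8%) extremes
print(within_group + gender_mixup + nationality, "% maximum variation")
```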
To minimise the error, I usually look for a picture with a group, and pick an 'average' person whose gender is fairly clear. Then I allow for a height variation of ±5% within a homogeneous group. If the person is obviously from a particular region, I make broad allowances for the height variation for that region also, such as Europeans versus Asians.
I try to find pictures with more than one person - not in the same line as I did in this example - from which I can determine the desired dimension, and take the average for different persons. In Fig. 17(c), from the man marked by the arrow I get a third u value as 3.51m, with a new average (3.52+3.44+3.51)/3, i.e. 3.49m, not too different from the previous estimate.
I also try to calibrate: I use a picture where I know the dimension sought or some other length, apply the pixel estimate method, and adjust my computed value for any difference, for that and similar pictures; the adjustment factor would take into account camera focal length, position, etc.
Ultimately, when dealing with a large number of cases, I aim at (hope for) an error potential of 10% (±5%), which is quite good (and achievable). Of course, such precision cannot be guaranteed when a single dimension in a single picture has to be estimated.
In the particular case analysed, I could have given my estimate (without knowing the board height) as 3.48±0.17m. Then the worst error from the correct 3.31m would have been 100×(3.48+0.17−3.31)/3.31, i.e. 10%, not bad. Even if I had been wrong by twice the expected deviation, i.e. ±10%, the final error would have been 100×(3.48+0.35−3.31)/3.31, i.e. 16% - still not bad, compared to when we knew nothing about it.
Beyond all these are the complexities of the graphics perspective itself. Best estimates are obtained when the camera is pointing directly along the centre line of whatever you wish to estimate, and held about waist level of people, as in the case presented. But the camera may be to one side or the other, or higher than people level, or the vista may be tilted or curved.
If there is a noticeable horizontal and/or vertical angle between the camera axis and the line perpendicular to the measurement sought, I try to apply corrections involving the angle.
The point is, you can’t give up - you must still keep pushing, putting your best foot forward!
When I use this technique, I should be prepared to satisfy the other side and the court. First, for the record, I must explain the graphics method I use, because it may not be a routine procedure. I must list the approximations I make and the resulting range of error.
I am sure to be asked how I know that the original pictures from the web had not been modified graphically to start with. I don’t, but I may explain that these were record shots (many from amateur blogs) and not promotional ones for the particular dimension in question. Here again, many and a variety of sources are the safest bet.
Will it survive in court? Nobody knows - it ultimately depends on the jury or judge!
In a similar graphical approach, I have extrapolated the motion paths of moving objects across individual frames of videos to estimate the starting and ending points of their trajectories.
Even if the technique does nothing to define the quantity, a large amount of such information on a particular item will at least show the qualitative variations and their range – which itself may be an addition to the body of knowledge on which a jury or judge must decide an issue.
9. ANIMATIONS AND SIMULATIONS
Animation and simulation are at the top of the list of computer applications in forensic engineering. If you can do it well, you have a powerful forensic tool. Of course, even if you cannot do it yourself, it is as well if you have the ideas and can get someone else to do it well for you, as clients will be happy to bear the extra charges for additional information you produce.
9.1. Difference between animation and simulation
Both animation and simulation involve graphic images in motion, but there is a difference, even separate legal definitions, in many Western courts.
Bow Tie Law’s Blog, [17] differentiates between them as follows:
"A computer animation has the following characteristics:
- Moving pictures not intended to simulate an event;
- Authenticated by a sponsoring witness with personal knowledge of the content of the animation;
- Showing that it fairly and adequately portrays the facts; and,
- Helps illustrate the testimony.
"A computer simulation has the following characteristics:
- Scientific evidence;
- Generally detailed and realistic recreated computer image of the event that can be manipulated; and,
- Can be portrayed from different angles or from the viewpoints of different witnesses."
My work on the falling worker in Fig. 12 is an animation and not a simulation, because it depicts parametric results with assumed data, and did not simulate any specific fall. If I add motion from dynamics equations applied to specific accident data, it may qualify as a simulation.
One must have permission from the court even to show any slide or video of factual matters, and the same applies all the more seriously to imaginary visuals. Animation/simulation must be:
(i) Authentic,
(ii) Relevant;
(iii) A fair and accurate representation of the evidence to which it relates; and,
(iv) Of probative value not substantially outweighed by the danger of unfair prejudice.
At first one would go gaga at how many tricks could be done with animation tools. But it is not worth getting too cute with them, because smart lawyers and judges know what is being done. If overdone, visuals could have a negative effect, like some slide presentations do where the colours and the slide changes look (and sound) like Diwali (Festival of Lights) fireworks.
9.2. Animation
A lot of animation is used in traffic engineering cases, because that is the easiest way the audience can understand the position of various vehicles involved in the accident before, during and after the event. Before computers became so powerful, and even now in the smaller (low budget) cases, lawyers and witnesses move around coloured blocks of appropriate sizes to demonstrate different views of what each side wants to prove happened.
But today, almost all traffic cases come with nice video presentations of different scenarios, often supported by car-mounted or street-corner videos. While judges are not supposed to be influenced by the action and colour, it is difficult not to be impressed by a well made video!
With my fascination for computer graphics, I try animation whenever and wherever I can. Here I give one professional example from a litigation consultant on the internet, and a couple of my own minor examples (which I can talk about) - nothing most others cannot do:
(a) Site accident animation and physical model:
Figure 18 shows stills from a video by Z-axis Litigation, [18] of a construction worker who is pulling some planks to make a work platform and in the process falls off the work area to his death. Top row in Fig. 18 is a 3D rendering of the scene, and bottom row shows parts of the animation of the event and a physical model.
Fig. 18. Construction accident animation and physical model
His survivors claimed that the employers did not provide fall prevention safeguards, but the forensic evidence, bolstered by the animation video and a one-third size physical model, proved that the worker had failed to anchor his safety harness to the safety line which had been within his arm’s easy reach. His family did not get any compensation.
(b) Animation of rebar support problems and solutions:
In one presentation about a case, I used animation to show how one rebar rod would slide around another if laid on top of each other, as shown in Fig. 19 (left). It looks clear enough in a still picture, but when displayed actually sliding, it is a dramatic demonstration.
In the same case, the contractor had argued that he was forced to change the in-plan positioning of support frames for rebars as shown in Fig. 19, right, from (a) to (b) so as to avoid lack of support where the bars were atop the bent corners of the frame as in (a).
Irrespective of whether this aggravated the collapse potential or not, the point was that this was a deviation from the design, and I pointed out that it would have been simpler, cheaper (because of no wastage of overlap in the contractor’s solution), and better adherence to the design if he had put a cap or even tied a piece of rod over the bend gaps as in Fig. 19, right, (c).
Fig. 19. Animation of rebar support problems and solutions
For greater effect, I animated the two frames sliding from (a) to (b) to overlap each other and support the middle rebar, and then sliding back from (b) to (c) with a cap to cover the gap. This ensured that everybody in court understood the implications of the motions described.
9.3. Simulation
Computer simulation, when permitted and done well, can be very effective. It does not demand the search for and acquisition of parts to build your experiments, or waiting for materials, technicians, or fabrication. And if it does not work as you expected, you can simply change one or a few entries in the input and re-do the analysis almost immediately. The best part of it is that it does not hurt anybody or damage anything if the product or structure fails.
You can recreate described conditions, or present alternative configurations much easier, faster, and cheaper in a computer than by building a physical prototype as was the practice before computer graphics. The colour, texture, shadows, almost everything that can describe a real object and its surroundings may be represented. This saves a lot of time and effort in court testimony.
You can play around to your heart's content and try out even stupid-sounding ideas without anybody else being the wiser about your foolishness!
Of course, you need to confirm, or calibrate your computer model with a few prototype or model tests. Otherwise it would be just another imaginary animation.
Although personally I have simulated many events and processes both in the lab and through finite element method in my structures research and consulting, I have not had the opportunity to produce a simulation for my personal expert forensic testimony thus far. I will present a few examples from public domain.
(a) Precast Panel Fall on Worker:
Any amount of verbal description accompanied by still photographs will not bring home the trauma of an actual accident like a video can, as indicated by the four stills shown in Fig. 20.
These are from the one-minute long 3D-video simulation of a crane operator controlling the lifting of a precast wall panel, and the panel breaking away from the crane hooks and falling towards the operator. He tries to escape, but the wall catches him on the back of the right leg. [19]. The video was specially made for presentation in a court case.
Fig. 20. Operator moving precast panel, getting injured
In this simulation, all the objects are realistically digitised to scale, and the motion is implemented to represent scientific principles. The view is continuously displayed and re-played from different positions so that the forensic engineer can show the breaking away of the anchors and the fall of the wall.
(b) World Trade Center Collapse:
The World Trade Center (WTC) terrorist attack on 11 September 2001 in New York is probably the world’s most investigated disaster of modern times, taking as it did a human toll of nearly 3000 innocent lives at once. Scientific curiosity, government imperatives and the scale of the tragedy demanded such an effort.
If we can look beyond the human tragedy, the numerous finite element analyses of the buildings and events involved make a beautiful application of computer technology in the representation of reality and pursuit of truth.
A most impressive one is the Purdue University FEA of the WTC North Tower attack, [20]. This is mostly simulation in the sense that the structure and interacting elements and processes are all scientifically accurate. But some animation has been added to show effects such as flames and smoke. Figure 21 displays the major features of the finite element modelling.
Fig. 21. Purdue University WTC disaster finite element model
Figure 22 shows screenshots from the 5-minute video simulating the crash and animating some of its effects, as follows:
1. Title and acknowledgement
2-3. Google Earth images, plane hitting, fire starting
4. Plane entering tower
5. Plane almost fully entered, in a fraction of a second
6-8. Interior view, plane entering, crashing through
9. Plane pieces exiting from tower
10-12. Interior view, fire starting and spreading
13. Exterior view, fire spreading
14-15. Only tower columns shown, plane hitting and slicing columns
16. Rendering of dust and glass pieces
Fig. 22. FEM simulation of WTC attack, screenshots from Purdue video
The video is amazingly realistic, slowed down considerably to depict the simulated crash.
What came out of such forensics was the finding that both the North and South towers stood as long as they did, despite the loss of many supporting columns, because the original designers had built considerable redundancy into the structure in the form of steel window mullions, which took over the load when the core columns failed. This allowed tens of thousands of occupants to rush down to safety.
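The redundancy mechanism described above can be sketched with a deliberately simplified model: parallel load paths share a load in proportion to their stiffness, so surviving members pick up the load shed by failed ones. The member counts, stiffnesses, and load below are assumed for illustration only; they are not taken from the WTC studies.

```python
# Illustrative sketch of load redistribution among redundant parallel load
# paths (assumed numbers; not from the actual WTC analyses).
def member_forces(total_load, stiffnesses):
    """Distribute a load among parallel members by relative axial stiffness."""
    k_total = sum(stiffnesses)
    return [total_load * k / k_total for k in stiffnesses]

if __name__ == "__main__":
    load = 1000.0                        # kN, assumed
    intact = [100.0] * 4 + [10.0] * 10   # 4 stiff columns + 10 mullion-like members
    print(member_forces(load, intact))   # columns carry most of the load
    # If the stiff columns fail, the mullion-like members carry everything:
    survivors = [10.0] * 10
    print(member_forces(load, survivors))  # each now carries 100.0 kN
```

The point of the sketch is that as long as the surviving members' combined capacity exceeds the redistributed forces, the structure stands; this is the essence of the redundancy the investigators credited for the towers' delayed collapse.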
The main recommendations made on the basis of the findings were to increase the width of stairs to permit faster evacuation, and to provide better fire protection.
10. CONCLUSION
Computers are here to stay. We need them in forensic engineering as much as in everything else, but we must be careful with them, particularly with powerful software, which can make our work much easier if used right, but can land us in deep trouble if misused.
11. ACKNOWLEDGEMENT
The author is grateful to Mr. Lim Kim Cheong of Messrs. Lim Kim Cheong Consultants for the computer services provided by them for the rebar grid collapse analysis, and to Ms. Moi Mei Ling of their staff for the input generation and running of the computer analysis for the numerous finite element models and load cases that the author devised to investigate the collapse.
12. REFERENCES
1. Krishnamurthy, N., "Investigative Methods in Forensic Civil Engineering", Proceedings of the Conference and Exhibition on Forensic Civil Engineering, ACCE, 23-24 August 2013, Bangalore, India.
2. Failure Analysis of Minneapolis I-35W Bridge Gusset Plates, Simulia. Retrieved July 2013:
3. LaMalva, K.J., Complete Report on Failure Analysis of World Trade Center 5, 7 p. Retrieved July 2013:
4. Pentium FDIV bug, Wikipedia. Retrieved July 2013: http://en.wikipedia.org/wiki/Pentium_FDIV_bug
5. "Error in proprietary design program", Report cros349, CROSS Newsletter, No. 31, July 2013. Retrieved July 2013:
6. Delatte, N.J. and K.L. Rens, "Forensics and Case Studies in Civil Engineering Education - State of the Art", ASCE Journal of Performance of Constructed Facilities, Vol. 16, No. 3, Aug. 2002, pp. 98-109.
7. Krishnamurthy, N., Three-Dimensional Finite Element Analysis of Thick-Walled Vessel-Nozzle Junctions with Curved Transition, ORNL-TM-3315, Oak Ridge National Laboratory, U.S. Atomic Energy Commission, July 1971.
8. Bell, G.R. and A.A. Liepins, "More Misapplications of the Finite Element Method", Forensic Engineering: Proceedings of the First Congress (K.L. Rens, Ed.), ASCE, Minneapolis, USA, 1997, pp. 258-267.
9. Johnson, R.G., Hartford Civic Center, 2009. Retrieved July 2013:
10. The sinking of the Sleipner-A offshore platform. Retrieved July 2013:
11. Emkin, Leroy Z., Comparison of Static Analysis Results Based on Different Models of a 67 Story Commercial Building, 2002, 113 p. Retrieved July 2013:
12. Krishnamurthy, N., "Safety in High-Rise Design and Construction", Seminar pre-print, published in 'Build Tech - 2006 Souvenir of International Seminar on High Rise Structures', Builders' Association of India, Mysore Centre, Dec. 2006, pp. 19-34. Retrievable from:
13. "Chapter 5 - Causes of the Collapse and Findings", Report of the Committee of Inquiry into the Incident at the MRT Circle Line Worksite that led to the Collapse of Nicoll Highway on 20 April 2004, Figures 5.2, 5.3.
14. Krishnamurthy, N., "Worker fall from mobile scaffold", Int. J. Forensic Engineering, Vol. 1, No. 1, 2012, pp. 21-46.
15. Krishnamurthy, N., Introduction to Computer Graphics, Tata McGraw-Hill Publishing Co. Ltd., New Delhi, India, 2002, 343 p.
16. Statistics reveal Britain's 'Mr and Mrs Average', BBC.
17. Computer Animations vs. Simulations: What is the Difference? Retrieved July 2013:
18. Construction Site Accident Animation and Physical Model, Z-Axis Litigation. Retrieved July 2013:
19. Construction Accident Recreation. Retrieved July 2013:
20. Purdue creates scientifically based animation of 9/11 attack, Purdue University News, 12 June 2007, 4 p. Retrieved July 2013:
Source Credit: The above article by Dr. N. Krishnamurthy was presented during the recently held Forensic Civil Engineering seminar, FCE 13, at Bangalore, organised by ACCE (I).