CAD Software

- 3D Design
- Hybrid modelling
- 2D & 3D Drafting
- Rendering & Visualisation

CAM Software

- Turning & Milling
- Multi-axis machining
- Sheet metal nesting
- Post processors

CAE Simulation and Analysis

- Structural, Motion and Dynamic Analysis
- Simulation Data Management
- Flow Simulation
- Thermal Analysis
- Acoustic Simulation
- Multi-Physics Simulation
- Control System Simulation

Product Data Management

- Engineering Change Order Management
- Engineering data vaulting
- Document management
- Technical publications
- Multi-CAD management

Transform your digital design and realize innovation

In a world of smart, connected products, where entire markets can vanish with a single innovation, manufacturers must take a new approach to business.

Some closely watch how products are being used, and feed data back from product utilization into product ideation and development in order to anticipate trends.
But even if you know what to make, you still have to make it. That’s why manufacturing – the realization phase of innovation – is vital in this new era.

Manufacturers must weave a digital thread through ideation, realization and utilization. It’s not enough to digitize. That just mimics processes digitally for incremental improvement.

You have to digitalize. Digitalization makes the digital thread of knowledge a proactive agent in driving your business. With a fully optimized “Digital Enterprise,” you are better equipped to initiate or respond to disruptive innovation.

To help you design smarter, we’re building a “Smart Innovation Portfolio” that delivers:

  • Engaged users who receive the right information at the right time – by transforming information so that only what’s relevant is delivered in a context suited to each person’s role.

  • Intelligent models that evolve throughout the process with the information necessary to optimize themselves for how they need to be built and how they should perform.

  • Realized products that achieve business goals through the integration of virtual product definition and real production execution.

  • An adaptive system that helps you efficiently deploy solutions today, while maintaining future flexibility.

EDGE plm software, Product Lifecycle Management specialists, can help you with:


Hybrid CAD - Solid Edge

Solid Edge is a portfolio of affordable, easy-to-use software tools that address all aspects of the product development process: 3D design, simulation, manufacturing and design management, supported by a growing ecosystem of apps.
Solid Edge combines the speed and simplicity of direct modelling with the flexibility and control of parametric design, made possible by synchronous technology.



CAE Simulation and Analysis

Simcenter solutions deliver a unified, scalable, open and extensible environment for 3D CAE with connections to design, 1D simulation, test, and data management. Simcenter speeds the simulation process by combining best-in-class geometry editing, associative simulation modelling and multi-discipline solutions embedded with industry expertise.


Machining - NX CAM Express

CAM Express is a modular, flexible configuration of numerical control (NC) programming solutions that allows you to maximize the value of your investments in the latest, most efficient and most capable machine tools. Easy to deploy and easy to learn, CAM Express provides powerful NC programming, with machining capabilities from 2.5-axis to 5-axis.


PDM - Teamcenter Rapid Start

Teamcenter Rapid Start delivers the world's most widely implemented product data management (PDM) solution, Teamcenter, preconfigured to apply industry best practices and the expertise of Siemens PLM Software. By choosing Teamcenter Rapid Start's preconfigured capabilities, you minimize consulting and deployment costs and get started with PDM quickly and cost-effectively.

Learn more about EDGE plm software:

EDGE plm software is a privately owned Australian provider of software solutions aimed at the Engineering and Manufacturing sectors. EDGE has been providing engineering design-centric solutions since 2004, with over 500 customers across Australia and New Zealand. Typical solutions from EDGE include software, maintenance, support, consulting and training services.

The EDGE software portfolio includes CAD, CAM, FEA & PDM solutions, and EDGE fully supports and offers training and mentoring services across its entire portfolio. EDGE has been a business partner of UGS/Siemens since 2004. EDGE also configures and sells Dell hardware to help our customers maximise their software investments. Read more about us…

Our Location

Our HQ Address:

EDGE plm software
16 / 94-102 Keys Rd
Moorabbin, Victoria 3189
Australia

Free Phone: 1300 883 653
Local Phone: 03 9532 0700
Fax: 03 9532 0788
Email: [email protected]

Talk to us now!

Ready to start designing better? Our gurus are standing by.

We can help you design better and faster in ways you never thought possible.
Talk to us now!

We offer a comprehensive product line

We tailor the right solution at the right budget

  • EDGE plm software is a solution provider of engineering design software and services.

  • We offer the full Siemens Mainstream Engineering product range, including Solid Edge, Femap, NX CAM & Teamcenter.

  • EDGE has been selling and supporting PLM solutions for over 11 years.

  • We have many years of training and mentoring experience.

  • We support Siemens Mainstream Engineering range in Australia & New Zealand.

  • We offer a free support help line for all users trialling Solid Edge or Femap.

  • We offer regular free training webinars and hints and tips sessions, plus regular newsletters.

  • We can tailor your training needs to your specific requirements either at your facility or ours.

  • Enjoy a hassle free support system, with a dedicated expert team. We’re ready to help!

Download a full trial here

Download a full 45-day Solid Edge / Femap trial

We believe you won't look back after you try it yourself.
Download a full trial here

Introducing Solid Edge ST: Re-imagine what’s possible

Build prototypes in ways you never thought possible.

Your CAD software should help you work smarter, not harder. That’s why there’s Solid Edge, a hybrid 2D/3D CAD system that uses synchronous technology that finally frees you of the limitations of your traditional CAD software.

Want to edit dimensions easily? Re-use imported data without the hassle? These tasks and more are easy with Solid Edge. Solid Edge leverages synchronous technology, enabling your company to deliver breakthrough designs. Designers can accelerate model creation without engaging in design preplanning. They can also perform faster ECO edits by eliminating model regeneration, while increasing the re-use of imported 2D or 3D data.

Solid Edge has proven successful in helping companies reduce engineering costs through better re-use of 2D and 3D data. Imported assembly layouts can drive 3D product design where interference checking can solve fit and position problems before manufacturing. Synchronous technology can edit imported 3D models, reducing the need for redesign.


With Solid Edge you can build entire 3D digital prototypes and optimize your designs before production. You can design assemblies with machined, cast or stylized components and leverage process-specific applications to simplify frame, piping, tube, wiring, weldment, and mold tooling design.

The Solid Edge user interface removes the need for unnecessary decisions. Logical inference engines recommend next steps and intuitively consider affected geometry. The SmartStep ribbon bar guides you through the feature creation process, presenting design decisions in a logical sequence, letting you easily review and change decisions to optimize your designs.

Solid Edge offers a full suite of tools that let designers author, edit, distribute, and explore design alternatives. Engineering teams can package design and supporting data into a compact collaboration file, facilitating fast design iteration. Using XpresReview, a free, downloadable viewer, files can easily be shared with internal teams, vendors, and customers.

Read the latest news from our blog:

Simcenter Testlab 18: Boost efficiency with interactive analysis

Simcenter Testlab Interactive Analysis

 

The most precious resource for a test engineer is often his time.

 

Imagine yourself in a situation where you need to:

  • quickly investigate an interesting phenomenon
  • rapidly process data
  • process channels differently
  • clean measured data

All those seemingly quick tasks can rapidly consume your working time, as they are often easier said than done. If you dream of a tool that could help you perform fast analyses efficiently, at any time and in a dynamic way, look no further: the Interactive Analysis functionality in Simcenter Testlab Neo was made for you.

 

The key to efficiency is flexible processing. This Interactive Analysis functionality can help you process acquired data more rapidly in order to proactively steer investigations in the right direction, helping you reduce the time spent finding the causes of problems.

 

Let me walk you through a couple of examples:

 

Say you need to differentiate channels. While measuring displacements with a string potentiometer, you obtain traces expressed in distance units, but you need to know the velocity instead. Just select the channels that you would like to differentiate, click the “differentiate processing” button and the results will be readily available.
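To make the underlying operation concrete, here is a minimal NumPy sketch of the same calculation. It is purely illustrative and not the Simcenter Testlab API; the sample rate and signal values are assumptions for the example.

    import numpy as np

    # Illustrative values only: a uniformly sampled displacement trace (metres)
    fs = 1024.0                                         # assumed sample rate in Hz
    t = np.arange(0.0, 2.0, 1.0 / fs)                   # two seconds of data
    displacement = 0.01 * np.sin(2 * np.pi * 5.0 * t)   # 5 Hz motion, 10 mm amplitude

    # Differentiate to obtain velocity in m/s (central differences)
    velocity = np.gradient(displacement, 1.0 / fs)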

 

Do you need to perform multiple calculations in a sequence?

 

Watch in the video below how I perform a low-pass filtering, followed by a scaling process (which multiplies the trace by a constant, changing the y-axis values), while still having the option to fine-tune processing parameters. Easy, right?
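Conceptually, that processing sequence is just a low-pass filter followed by a multiplication. The sketch below shows the same two steps with SciPy; it is a stand-alone illustration using assumed values (sample rate, cut-off frequency, scale factor), not the Testlab implementation.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 1024.0                                   # assumed sample rate in Hz
    t = np.arange(0.0, 2.0, 1.0 / fs)
    trace = np.sin(2 * np.pi * 5.0 * t) + 0.2 * np.sin(2 * np.pi * 200.0 * t)

    # Step 1: low-pass filtering (4th-order Butterworth, 50 Hz cut-off, zero-phase)
    b, a = butter(4, 50.0, btype="low", fs=fs)
    filtered = filtfilt(b, a, trace)

    # Step 2: scaling - multiply the trace by a constant, changing the y-axis values
    scaled = 9.81 * filtered                      # assumed conversion factor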

 

Now, let’s imagine that during a measurement some artefacts appear in the measured data—for instance, unwanted spikes in a measurement trace, caused by your test vehicle accidentally driving on a road bump. How would you remove them? A simple time editing process would help in cleaning your data by removing unwanted spikes manually.
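As a rough idea of what such a cleaning step does, the sketch below flags samples that deviate strongly from the rest of the trace and replaces them by interpolation. It is a simplified, hypothetical stand-in for the interactive time editing described above, not how the tool does it.

    import numpy as np

    def remove_spikes(trace, threshold=5.0):
        """Replace samples far from the global median (MAD criterion)
        with linearly interpolated values. Expects a NumPy array."""
        median = np.median(trace)
        mad = np.median(np.abs(trace - median)) + 1e-12   # robust spread estimate
        spikes = np.abs(trace - median) > threshold * mad
        cleaned = trace.copy()
        good = np.flatnonzero(~spikes)
        cleaned[spikes] = np.interp(np.flatnonzero(spikes), good, trace[good])
        return cleaned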

 

Now, let me explain how Interactive Analysis can add even more flexibility to the Simcenter Testlab Process Designer functionality. When we use Process Designer, a complete set of channels is typically processed as a run. In that case, the same processing is applied to all channels in the set. The Interactive Analysis method makes it possible to define a different processing function for each channel separately, or to perform some extra processing on a few selected channels. Note that the applied processing can be a simple calculation or a sequence of calculations (as in the previous example, a low-pass filtering followed by a scaling processing).
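The idea of per-channel processing can be pictured as a simple mapping from channel name to processing function. The snippet below is only a conceptual sketch with invented channel names and functions, not how Simcenter Testlab stores or runs processes.

    import numpy as np

    def differentiate(x, dt=1.0 / 1024.0):
        return np.gradient(x, dt)

    def lowpass_and_scale(x):
        return 9.81 * x          # placeholder for a filter-plus-scaling sequence

    # Extra processing for selected channels only; others pass through unchanged
    per_channel = {
        "wheel_displacement": differentiate,
        "body_acceleration": lowpass_and_scale,
    }

    def process_run(run):
        """run: dict mapping channel name -> NumPy array of samples."""
        return {name: per_channel.get(name, lambda x: x)(data)
                for name, data in run.items()}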

 

How do you do that?

 

First, create an Interactive Analysis method, then double-click on it to find the Interactive Analysis user interface. Select a channel from the channel set, and finally, apply the extra processing on that channel only.

 

Can processes built in Simcenter Testlab Process Designer run in Simcenter Testlab Interactive Analysis? Yes, they can. Both processing applications work hand in hand.

 

Discover the Simcenter Testlab Interactive Analysis capabilities by yourself in Simcenter Testlab 18.

How software architecture can secure your software development

This article is a continuation of my previous post on the importance of software architecture and how it can help you avoid a Jenga lifecycle. If you have not checked that one out yet, please take a look here.


 

Now, coming to this blog, and as you may have guessed from the title, I am going to talk a bit about rock climbing and the act of belaying, and try to map the same principle to our embedded software development lifecycle. So get into your harnesses and read on.

 

As someone who enjoys the outdoors, I find climbing quite fascinating due to its technicality and problem-solving aspects. Let me briefly explain the basics of sport climbing here. Sport climbing relies on fixed bolts for protection along a predefined route. The climber ascends the route with the rope tied to his or her harness and clips into each bolt or quickdraw to protect against a fall.

 

A belayer

A climbing partner (belayer) typically applies tension at the other end of the rope whenever the climber is not moving, and removes the tension from the rope whenever the climber needs more rope to continue climbing. 

It is important for the belayer to closely monitor the climber's situation, as the belayer's role is crucial to the climber's safety. Too much slack on the rope increases the distance of a possible fall, but too little slack on the rope may prevent the climber from moving up the rock.

 

Ok, enough on the climbing, but where am I going with this? Imagine the whole climbing route as your software coding process, each bolt or quickdraw as the completion of a sprint or a particular task, and the belayer as the software architecture. A software architecture must play the same role in the software development lifecycle (SDLC) as the belayer does in climbing: to control and monitor the development process against possible falls or roadblocks.

 

 

Now, there are two main reasons why your belaying may not be adequate when it comes to software implementation:

 

  1. You do not make use of software architecture at all and jump directly into the coding process upon receiving the requirements and specification documents
  2. You do use software architecture to plan your development activities in the beginning, but as you go ahead it becomes hard to keep development and architecture synchronized due to missing or very loose links between the two

 

 

The first case can be considered as climbing without a belayer at all; it can have bad consequences when something goes wrong and may result in time and budget overruns to complete the project. The second case comes close in that sense: the software architecture helps you gain confidence at the beginning of the project, but as you move along the SDLC and the number of lines of code increases, you start to lose that confidence. This may be because your software architecture is not good enough to incorporate incoming changes, or because the link between architecture and coding was not adequate.

 

This brings us to Simcenter Embedded Software Designer. Simcenter Embedded Software Designer lets you do more with your architecture models and enables your distributed software engineering teams to use whatever best-in-class tools they prefer for development.

Solution: Architectures leveraging contracts to break down development and testing barriers

You can generate a rich implementation template directly from the software architecture, for C-language or Simulink models, to efficiently support the software and controls engineering activities. It also allows you to work efficiently with legacy projects, thanks to a set of powerful features that guide you in migrating legacy code to models. Integration and contract validation capabilities allow you to get your system configured right the first time. Let me explain each of these three in a bit more detail:

 

Application Template Generation

 

Simcenter Embedded Software Designer helps you generate rich implementation templates for C-language or Simulink models to efficiently support the software and controls engineering activities. In the case of external C programming, the templates come in the form of C code and header files containing all required functions, whereas in the case of external Simulink implementation, the template comes in the form of a Simulink model representing the blocks, ports and connections. In both cases, templates are strongly enriched and linked to architecture elements to allow change management and efficient integration.
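To illustrate the concept only (this is not the tool's actual output format or API), the sketch below generates a C header and stub from a small, invented interface description, which is roughly the kind of starting point an implementation template gives the software engineer.

    # Hypothetical illustration of template generation; names and format are invented.
    interface = {
        "component": "SpeedController",
        "functions": [("void", "SpeedController_step",
                       "const float target_speed, float *throttle_cmd")],
    }

    def emit_c_template(spec):
        name = spec["component"]
        header = [f"#ifndef {name.upper()}_H", f"#define {name.upper()}_H", ""]
        source = [f'#include "{name}.h"', ""]
        for ret, fn, args in spec["functions"]:
            header.append(f"{ret} {fn}({args});")
            source += [f"{ret} {fn}({args})", "{", "    /* TODO: implement */", "}", ""]
        header += ["", f"#endif /* {name.upper()}_H */"]
        return "\n".join(header), "\n".join(source)

    header_text, source_text = emit_c_template(interface)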

 

Legacy Code to Model Conversion

 

You can work directly with your legacy projects thanks to a set of powerful features that guide you in migrating legacy code to models. Re-using legacy software increases overall software dependability, as it has already been tried and tested in working systems and its design and implementation faults have been fixed. Simcenter Embedded Software Designer helps you efficiently analyze, instrument and extract the required software content from the legacy C project and convert it into a model for further reconciliation, verification and validation.

 

Software Integration and Validation

 

 

Having a very tight integration between software architecture and implementation activities helps users optimize and get their systems configured right the first time, thanks to easy integration and validation based on contracts. The shipped code may or may not adhere to the interface specifications provided in the implementation templates. The first integration step is, therefore, an automated check of whether the supplied code adheres to the interfaces. Simcenter Embedded Software Designer clearly points out integration issues arising from broken interfaces.
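The essence of that first integration step can be pictured as comparing the signatures declared in the architecture (the contract) against those found in the delivered code. The sketch below uses invented names and is not the tool's checking mechanism.

    declared = {"SpeedController_step": ("void", "const float, float *")}
    delivered = {"SpeedController_step": ("void", "const float, float *"),
                 "SpeedController_init": ("void", "void")}

    def check_interfaces(contract, implementation):
        """Report functions that are missing or whose signatures differ."""
        issues = []
        for fn, sig in contract.items():
            if fn not in implementation:
                issues.append(f"missing function: {fn}")
            elif implementation[fn] != sig:
                issues.append(f"signature mismatch: {fn}")
        return issues

    print(check_interfaces(declared, delivered))   # an empty list means the contract is met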

 

So, next time you start a software project, make sure your belay game is strong, as it will ease your complex software development project.

 

Simcenter Embedded Software Designer offers perfectly coordinated solutions for holistic software development: a rich software architecture is used as a central platform to efficiently support the entire value chain for embedded software development, using the design-by-contract methodology to support integrated model-based software engineering. With a contract-based architecture design approach, Simcenter Embedded Software Designer allows the development of complex software systems with architecture analysis, frontloading of test and verification, closed-loop simulation and interoperability with other development tools and platforms.

 

Want to learn more about Simcenter Embedded Software Designer? Discover other blog posts

 

Simulink is a registered trademark of The MathWorks, Inc.

The beauty of optimization: what ears and mufflers have in common

Beethoven's life was marked by the tragedy of his deafness, the symptoms of which appeared in his late twenties. Before becoming completely deaf, he used ear-horns to improve his hearing, tools that visitors can see at the Beethoven museum in his native Bonn [1]. Unfortunately, his problem most likely had a severe pathological origin and such tools did not help him significantly over the years [2]. The first record of ear-horn usage dates back to 1634, when the French Jesuit Jean Leurechon described them in his writings. Their main purpose is to convey acoustic waves towards the ear, but they disappeared around the 1960s in favour of more practical medical devices.

 

Now, if you are familiar with natural evolution, you may be wondering why we did not end up with cone-shaped ears to enhance our hearing capabilities. At the end of the day, the ear's function is to make us hear. The problem is that ear-horns work well only if the person points them towards the sound source: a cone-shaped ear would have prevented us from locating the provenance of sounds, which would have been quite dangerous for our survival. Another aspect of human ears is their ability to better capture sounds at frequencies typical of the human voice. Basically, we have human ears to listen to the human voice. All in all, natural evolution had to optimize the ear topology, a process that took millions of years, to satisfy several criteria and objectives without compromising any one aspect.

 

Speaking of which, what is sound? Sound as we know it becomes fascinating because of the way our human brain processes it. In reality, sound is much more banal: it is nothing but a mechanical vibration propagating in a medium, obeying all the laws of nature related to waves. Sound is thus subject to reflection, refraction and diffraction. Such capricious phenomena are of course a nightmare when dealing with engineering design. The first example that comes to my mind is the acoustic waves produced by the heat-release fluctuations in a gas turbine flame. These waves can travel in a combustion chamber and perturb the flame. Even worse, they can amplify and make combustion unstable. If you are a gas turbine engineer, you have all my sympathy.

 

If you are a muffler engineer, your life won't be easy either. The muffler is a device at the very end of the powertrain that suppresses the acoustic waves coming from the engine. The acoustic loss is triggered by abruptly changing the area of internal ducts (expansion chambers), creating reflected waves that can cancel out the incoming waves, or by absorbing the acoustic energy, for example by means of perforations in the ducts that connect with a volume filled with a porous material. As usual in engineering, "you never get something for nothing". There are two pieces of bad news in this scenario: the first is that the muffler's presence increases the backpressure, thus decreasing the overall efficiency of the engine. The second is that increasing transmission loss and decreasing backpressure in a muffler are competing objectives. I told you: a muffler engineer's life is not easy. But in this blog, we'll try to crack the code anyway.

Figure 1: The optimization workflow

 

Optimizing the muffler design during manufacturing is quite cumbersome. In a typical manufacturing process, engineers do not optimize backpressure and noise reduction at the same time. In current industrial processes, backpressure is reduced through geometry optimization; noise reduction targets are verified only in a second phase, and the whole process relies heavily on the engineer's experience. It is time consuming and engineers can build and test only a few designs. That is why a combined approach is more convenient and highly desirable. In this blog we will introduce a new approach combining HEEDS (the Simcenter optimization tool), Simcenter 3D to deal with the acoustic part, and Simcenter STAR-CCM+ to provide the backpressure prediction. Figure 1 shows the workflow. The optimization is performed in HEEDS with the SHERPA algorithm, which selects a set of geometrical parameters at the beginning of each iteration. This set is read by the CAD modeller NX, which builds a Parasolid model that can be read by both Simcenter STAR-CCM+ and Simcenter 3D. Simcenter 3D calculates the transmission loss, defined as the ratio between the acoustic power at the inlet and that at the outlet, as a function of the inlet noise frequency. Of course, SHERPA will try to maximize the transmission loss, as higher values translate into a larger noise reduction. At each iteration, a coefficient CF is calculated to measure the distance between the transmission loss-frequency curve and the baseline one. When both Simcenter STAR-CCM+ and Simcenter 3D provide their responses, SHERPA decides on a new set of parameters that can satisfy the objectives.
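For reference, the transmission loss used as an objective here follows the standard acoustic definition,

    TL(f) = 10 · log10( W_inlet(f) / W_outlet(f) )  [dB],

where W_inlet and W_outlet are the acoustic powers at the muffler inlet and outlet. The coefficient CF then simply quantifies how far the computed transmission loss-frequency curve lies from the baseline curve.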

 

Figure 2: Pareto front of the optimization study.

 

I admit, as CFD engineers we might get a little snobbish when looking at the apparent looseness of the laws of economics. Yet, to crack the code, we will use a concept borrowed directly from the Italian economist Vilfredo Pareto. Apart from the 80/20 rule, he also gave life to the Pareto front, which is useful for identifying the trade-off between competing objectives. Figure 2 shows the Pareto front for this study as a function of backpressure and CF, with the light blue area showing unfeasible designs. It is possible to distinguish three different regions. Region 1 shows a clear Pareto front tendency where the two objectives compete: the higher the transmission loss, the higher the backpressure. Region 2 is a region of low backpressure, whilst Region 3 is sparsely populated. This means that SHERPA focused mainly on Regions 1 and 2.

 

Figure 3a (left): Baseline; Figure 3b (center): from Region 2; Figure 3c (right): from Region 1.

 

Figure 3da (left) and 3db (right): both from Region 3.

 

 

A little golden rule: SHERPA is a fantastic algorithm, but critical analysis of the results by the human engineer is still necessary. Figure 3 and its subfigures help us understand the results. Figure 3a shows the baseline design, in which you can notice three distinct chambers through which the flow is forced to pass. Figure 3b is representative of Region 2, a region with low backpressure. As you can see, the internal chambers have collapsed together, with the gas flowing directly from the inlet to the outlet. Although for SHERPA this is a feasible design, in reality it is not a real muffler configuration. This suggests that there is a physical limit to further decreasing the backpressure of our baseline. Therefore, Region 2 is discarded. Figure 3c comes from Region 1. In this region all the designs are valid, with the typical three-chamber structure of a muffler. As said above, in this region we are not able to increase the transmission loss and decrease the backpressure at the same time. Figures 3d are more interesting: they come from Region 3, in which we can observe both valid (Figure 3da) and non-valid (Figure 3db) designs. More interestingly, this region sits at backpressure values below the baseline design, with good potential for improvement. However, Region 3 is scarcely populated, and yet it is exactly there that our best designs will be found.

Figure 4: Pareto front of the detailed study. Boundaries for backpressure are set between 3200 and 4100 Pa.


Figure 5a (left): Baseline; Figure 5b (center): Backpressure reduced by 9.7%. CF=1.09; Figure 5c (right): Backpressure reduced by 6.8%. CF=1.70. 

 

SHERPA allows you to be a perfectionist. The improved design of Figure 3da is only the starting point of our analysis. We start to see the light at the end of the tunnel, but in my opinion we have not cracked the code yet. From the first optimization study, we observe that at very low backpressure we do not obtain valid designs. For this reason, we now ask SHERPA to build designs in a region in which backpressure has a lower and an upper limit. The resulting Pareto front is shown in Figure 4. The designs in the red area are all valid designs, with an improvement in both backpressure and the coefficient CF, which is very good news for our target. Ready to pick the best design? We actually have two good candidates, highlighted in Figure 4 as A and B, and they have very similar geometries. Depending on the final design requirements, the engineer will decide whether to manufacture design A (with a 9.7% improvement in backpressure) or design B (with a 6.8% improvement in backpressure, but with a higher CF). Designs A and B are shown in Figure 5, along with the baseline. As you can see, the geometries are quite similar: in other words, the variations obtained in the final targets are actually the result of very tiny geometrical variations. This can also be observed with one of my favourite HEEDS tools: the parallel plot (Figure 6). This plot allows you to identify designs that provide the desired responses and to see the boundaries within which their geometrical parameters vary. On the x-axis, we plotted the backpressure, the coefficient CF and the geometrical dimensions varied by SHERPA during the studies. We highlighted in blue only the designs falling on the valid Pareto front of Figure 4. These designs bundle together and their geometrical dimensions vary within a very narrow interval, suggesting that all these designs are indeed quite similar. Yet, they have a strong impact on the final response.


Figure 6: Parallel plot for the detailed analysis. a* are geometrical variables that were modified during the studies. Results are filtered for backpressure (3200 Pa < backpressure < 4100 Pa) and CF (0 < CF < 5).

 

Let me finish the blog with one open question: our hearing capabilities took more than a hundred million years to evolve. Do you think SHERPA would have reduced the time needed to optimize the ear topology? I am firmly convinced that the answer is yes!

 

References

 

[1] Beethoven's ear-horns in Bonn

[2] Beethoven's loss of hearing

A smart guide: Step-by-step Durability Engineering based on test data

The durability of a product is important, whether you develop an electric car or a new autonomous vehicle. Can you imagine an autonomous vehicle falling apart right after driving over a single bump? It can happen that the focus falls on the development of the latest technology rather than the engineering of essential aspects, such as durability. To develop a successful product that meets customers’ expectations, long-term quality and durability need to stay under the spotlight.

 

How to ensure that your developed vehicle will be durable? How to start?

Is it all about executing vehicle durability testing while driving on rough roads before presenting the vehicle to the market? Or is it about performing a CAE-based durability analysis upfront and eliminating the red spots with preventive actions? Well, actually, it requires a bit more. We definitely need durability tests that rely on road load data measurements and processing to guarantee strength efficiently (customer correlated and multi-attribute balanced). Let’s take a look at the entire durability process, based on road loads.

 

 

Durability engineering process based on road load data

 

Step 1: Instrumentation

The vehicle durability testing process starts in the office. The simulation engineer or test engineer defines a list of all points on the vehicle for which he or she would like to have real-life loads. With this list, the technician goes to a lab and instruments a vehicle with all the sensors required to measure road load data, such as accelerometers, strain gages, displacement sensors, wheel force transducers (WFTs), etc.

 

Measuring the loads that act from the suspension onto the body

  

Step 2: Test setup

When the vehicle is completely instrumented, the test engineer connects all sensors to the measurement system and defines the channels in the acquisition software by filling out the parameters for each respective sensor.

 

Simcenter Testlab: Durability testing – Sensor database lookup

 

Step 3: Measure and Validate

After that, a durability test driver takes the vehicle to the proving ground, where he drives it over different durability tracks in order to simulate real-life usage.

As a next step, the test engineer looks at the data to validate whether it is OK before going back to the office.

 

 

A test engineer validates the acquired data using the Simcenter Testlab Online Processing feature

 

Step 4: Analysis

The durability test engineer needs to clean the acquired data by removing/correcting anomalies, such as spikes, drifts, and offsets, before sharing the data with other teams.

Once the data is clean, load and fatigue analysis can be conducted. This can be done either with counting methods, such as Rainflow, Range Pair, Time at Level and Level Crossing, or with Stress Life and Strain Life experimental fatigue analysis, for a better understanding of operational loads.
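As background on the stress-life approach (standard fatigue theory rather than anything specific to Simcenter Testlab): rainflow counting turns the measured load history into cycles of amplitude S_i occurring n_i times, an S-N (Basquin-type) curve gives the number of cycles to failure N_i at each amplitude, and the Palmgren-Miner rule accumulates the damage as

    D = sum_i ( n_i / N_i ),

with failure expected around D = 1. This accumulated damage is also the quantity that is kept constant when a long measured profile is later compressed into an accelerated test profile.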

 

A test engineer gains understanding of operational loads using the Simcenter Testlab Load & Fatigue Analysis solution

 

Very often, the durability engineer needs to minimize costs or long test durations by creating accelerated life testing profiles for test benches. The new profile can be either random time signals or block cycles, depending on the capabilities of the test benches. It is also vital to keep the damage the same, which is verified by comparing the original and the shortened profile.

 

Step 5: Report and Share

Finally, the durability test engineer can process the acquired data and share it with the simulation engineer and the physical validation engineer. The simulation engineer can now perform a CAE analysis at the component, system or vehicle level. The test engineer can also perform rig testing or proving ground testing as part of the physical testing. In addition to data sharing, reports will enable you to gain a better understanding of the acquired data at first sight.

 

 

Simcenter Testlab: Quicker delivery by embedding complete test data in actionable MS Office-based reports

 

To summarize, executing simulation and physical validation are duties of other groups within the ‘durability world’. The steps described here, from the moment the test engineer connects the sensors and the measurement system in the lab and sets up a channel list, up to reporting and sharing, are part of test-based durability engineering.

Discover how Siemens PLM can help execute all these steps in an easier, faster and more advanced way.

END-TO-END DURABILITY TESTING VIDEO

See this website for more information.
