Engineering Design Optimization


Summary

Engineering design optimization is the process of using mathematical models and algorithms to improve products or systems so they meet specific goals, like reducing weight, saving materials, or boosting performance. New advances bring together AI, simulation, and creative design approaches to tackle both small and large-scale engineering challenges.

  • Bridge design gaps: Try embedding tunable parameters in simple models and use simulations to refine them for more realistic, high-quality results.
  • Adopt generative tools: Let AI-powered software suggest multiple innovative solutions based on your design goals, which can help you find lighter, stronger, and more efficient options.
  • Streamline with data: Integrate real engineering data into your optimization pipeline, allowing models to learn directly from shapes and performance outcomes for faster, smarter product development.
Summarized by AI based on LinkedIn member posts
  • View profile for Jousef Murad
    Jousef Murad is an Influencer

    CEO & Lead Engineer @ APEX 📈 Drive Business Growth With Intelligent AI Automations - for B2B Businesses & Agencies | Mechanical Engineer 🚀

    181,647 followers

    Traditional surrogate-based design optimization (SBDO) is hitting a wall, especially with high-dimensional, complex designs. In this new paper, Dr. Namwoo Kang presents a next-gen framework using generative AI, integrating three key models:
    - Generative model (design synthesis)
    - Predictive model (performance estimation)
    - Optimization model (iterative or generative)

    Rather than optimizing directly in a high-dimensional design space (x), the workflow introduces a low-dimensional latent space (z) learned via generative models.

    ➡️ z → x → y
    z = latent variables
    x = CAD geometry
    y = performance (drag, stress, etc.)

    This means we’re no longer hand-coding design parameters or doing trial-and-error with simplified surrogate models.

    🧠 Why this matters:
    - Parametric modeling is no longer a bottleneck
    - Complex shapes are learned directly from CAD
    - Dynamic and multimodal performance data (1D, 2D, 3D) can be used
    - Near real-time optimization is possible

    #AI #GenerativeDesign #CAE #DesignOptimization
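The z → x → y workflow can be sketched with toy stand-ins (the decoder and surrogate below are illustrative placeholders invented for this sketch, not models from the paper): a generative model decodes a low-dimensional latent vector z into a geometry x, a predictive model estimates performance y, and the search happens entirely in z.

```python
import random

# Toy stand-ins for the framework's components (illustrative only, not the
# actual models): a "decoder" plays the generative model (z -> x), and a
# "surrogate" plays the predictive model (x -> y).
def decode(z):
    # Maps a 2-D latent vector to a 5-D "geometry" via a fixed linear map.
    return [z[0] + z[1], z[0] - z[1], 2 * z[0], 3 * z[1], 0.5 * z[0]]

def surrogate(x):
    # Pretend performance metric (e.g. drag): lower is better.
    return sum((xi - 1.0) ** 2 for xi in x)

def optimize_in_latent_space(steps=2000, seed=0):
    """Random-search hill climbing in the 2-D latent space z,
    instead of the higher-dimensional design space x."""
    rng = random.Random(seed)
    best_z, best_y = [0.0, 0.0], surrogate(decode([0.0, 0.0]))
    for _ in range(steps):
        cand = [zi + rng.gauss(0.0, 0.2) for zi in best_z]
        y = surrogate(decode(cand))
        if y < best_y:
            best_z, best_y = cand, y
    return best_z, best_y

best_z, best_y = optimize_in_latent_space()
```

The point of the sketch is the indirection: the optimizer only ever touches z, and any gradient-free (or, with a differentiable decoder, gradient-based) method can be plugged into the loop.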

  • View profile for Can Li

    Assistant Professor at Purdue University

    2,308 followers

    🎯 How can we use a low-fidelity optimization model to achieve performance similar to a high-fidelity model?

    Many decision-making algorithms can be viewed as tuning a low-fidelity model within a high-fidelity simulator to achieve improved performance. A great example comes from Cost Function Approximations (CFAs) by Warren Powell. CFAs embed tunable parameters, such as cost coefficients, into a simplified, deterministic model. These parameters are then refined by optimizing performance in a high-fidelity stochastic simulator, via either derivative-free or gradient-based methods. A similar philosophy appears in optimal control, where controllers are tuned using simulation optimization.

    ⚙️ Inspired by this paradigm, my student Asha Ramanujam recently developed the PAMSO algorithm. PAMSO (Parametric Autotuning for Multi-Timescale Optimization) tackles complex systems that operate across multiple timescales:
    - High-level decision layer: makes strategic decisions (e.g., planning, design).
    - Low-level decision layer: takes high-level inputs, makes detailed operating decisions (e.g., scheduling), applies detailed constraints and uncertainties, and computes the true objective.

    However, one-way top-down communication between layers often results in infeasibility or poor solutions, due to mismatches between the high-level model and the detailed low-level operating model.

    💡 PAMSO augments the high-level model with tunable parameters that serve as a proxy for the complex physics and uncertainties embedded in the low-level model. Instead of attempting to jointly solve both levels, we fix the hierarchical structure: the high-level layer makes planning or design decisions and passes them down to the low-level scheduling or operational layer, which acts as a high-fidelity simulator. We treat this top-down hierarchy as a black box:
    - The inputs are the tunable parameters embedded in the high-level model.
    - The output is the overall objective value after the low-level simulator evaluates feasibility and performance.

    By optimizing these parameters with derivative-free methods, PAMSO steers the entire system toward high-quality, feasible solutions.

    🚀 Bonus: transfer learning! If the parameters are designed to be problem-size invariant, they can be tuned on smaller problem instances and transferred to larger-scale problems with minimal extra effort.

    ⚙️ Case studies demonstrate PAMSO’s scalability and effectiveness in generating good, feasible solutions:
    ✅ An MINLP model for integrated design and scheduling in a resource-task network with ~67,000 variables
    ✅ A massive MILP model for integrated planning and scheduling of electrified chemical plants and renewable energy with ~26 million variables

    Even solving the LP relaxation of these problems exceeds memory limits, and their structure is not easily decomposable by standard optimization techniques. https://lnkd.in/gDfcvDaZ
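The CFA/PAMSO idea can be illustrated with a deliberately tiny example (all names, costs, and numbers below are hypothetical, not from the paper): a deterministic high-level model plans capacity with one tunable safety factor, a stochastic "simulator" reports the true cost including shortfall penalties, and a derivative-free grid search tunes the parameter through the black box.

```python
import random

# Hypothetical toy problem (not the actual PAMSO code): a tunable safety
# factor theta inflates demand in a simple deterministic planning model, and
# a stochastic simulator evaluates the true cost of the resulting plan.
MEAN_DEMAND = 100.0

def high_level_plan(theta):
    """Low-fidelity, deterministic model: build capacity for inflated demand."""
    return MEAN_DEMAND * theta

def simulate_true_cost(capacity, n=5000, seed=1):
    """High-fidelity stochastic simulator: capacity cost plus a heavy
    shortfall penalty, averaged over random demand scenarios.
    Reusing the same seed gives common random numbers across candidates."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        demand = rng.gauss(MEAN_DEMAND, 20.0)
        shortfall = max(0.0, demand - capacity)
        total += 1.0 * capacity + 10.0 * shortfall
    return total / n

def tune_theta(candidates):
    """Derivative-free tuning: evaluate each theta through the simulator."""
    return min(candidates, key=lambda t: simulate_true_cost(high_level_plan(t)))

best = tune_theta([0.8 + 0.05 * i for i in range(12)])  # thetas 0.80 .. 1.35
```

The deterministic model alone would pick theta = 1 (capacity equals mean demand); tuning through the simulator instead learns a safety margin, because the parameter absorbs the uncertainty the low-fidelity model cannot represent.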

  • View profile for Andrew Ning

    Professor at Brigham Young University | Joint Appointment at National Renewable Energy Laboratory

    2,147 followers

    Geometrically exact beam models are particularly useful for highly flexible structures like wind turbine blades and many aircraft applications. However, using them efficiently in design optimization poses some challenges. Taylor McDonnell just published a new model that reformulates the equations in a way that is more efficient and robust, smooths out singularities to allow for effective design iterations, and efficiently computes derivatives, via adjoints, for steady and unsteady problems.

    Journal paper: https://lnkd.in/gJJ2evZv
    Open-source code: https://lnkd.in/g65-hUfx

    Taylor was a rock star in the lab, developing all sorts of well-documented code packages and making advances in optimization and aeroelasticity. Excited to see him continue his career at AeroVironment.

    #optimization #adjoint #aeroelasticity #design #windenergy #aeronautics

  • View profile for Raymundo Arroyave

    Professor at Texas A&M University

    4,516 followers

    🔍 New Preprint — Good Enough is Better: Rethinking Optimality in Alloy Design

    In 'academic' materials discovery, we’re often trained to chase the “optimal” alloy—the one that perfectly balances all competing objectives. But here’s the uncomfortable truth: we can never actually prove we’ve found the optimal solution. The search space is enormous, the ground truth is unknowable, and what appears “best” today may fail tomorrow if it misses even a single hard performance constraint. More importantly, most realistic materials development applications are constraint-dominated: the performance requirements are so stringent that finding even one material that works counts as a success, regardless of whether it is truly the “best” material for the job.

    Feasibility, on the other hand, is knowable. It is binary (the material works, or it doesn’t), auditable (we can test each material against each performance constraint), and aligned with how real engineering decisions are made (in practice, designers rarely need the “best” material—they just need one that works). In other words, a material that satisfies all performance requirements is good enough—and good enough gets deployed.

    📈 In our latest study (with Cayden Maguire, Chris Hardcastle, Trevor Hastings, Brent Vela), we benchmarked Bayesian constraint satisfaction against multi-objective, constraint-aware Bayesian optimization in a realistic alloy design problem with 40,000+ candidates, fewer than 0.05% of which are feasible.

    Key findings:
    ✔️ Constraint satisfaction finds viable alloys much faster
    ✔️ It recovers all feasible alloys (solving a true needle-in-a-haystack problem)
    ✔️ Optimization, even when constraint-aware, often struggles to find even one feasible candidate
    ✔️ Feasibility-first strategies reduce risk and compress timelines
    ✔️ Optimization is still valuable—but only after feasibility is secured

    💡 Takeaway: In real-world materials design, feasibility should come first. It is the only thing we can test and certify—unlike optimality, which we can only approximate. In early-stage discovery, verified feasibility beats notional optimality every time.

    📄 Preprint: https://lnkd.in/ektQRq22
    🔬 Code: https://lnkd.in/e4gW_tV3

    Grateful for support from ARPA-E, ARL/HTMDEC, NSF, and TAMU HPRC.

    #MaterialsScience #BayesianOptimization #ConstraintSatisfaction #AlloyDesign #MachineLearning #SelfDrivingLabs #ICME #MLForScience #ARPAE #HTMDEC #ActaMaterialia #DigitalDiscovery
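The feasibility-first contrast can be sketched on synthetic data (the candidate pool, constraints, and objective below are invented for illustration; this is not the paper's Bayesian method): ranking candidates by their worst constraint margin reaches a feasible point far sooner than ranking by an objective that ignores the constraints.

```python
import random

# Synthetic pool, invented for illustration. Each candidate "alloy" has two
# properties (strength, ductility) that trade off in the objective, and both
# must clear a hard minimum; only a tiny fraction of the pool is feasible.
rng = random.Random(42)
pool = [(rng.random(), rng.random()) for _ in range(5000)]
T = 0.95                      # constraint threshold -> ~0.25% feasible

def feasible(c):
    return c[0] >= T and c[1] >= T

def evals_until_feasible(ranked):
    """How many candidates must be 'tested' before the first feasible one."""
    for i, c in enumerate(ranked, 1):
        if feasible(c):
            return i
    return None

# Feasibility-first: rank by worst constraint margin (most feasible first).
by_feasibility = sorted(pool, key=lambda c: min(c[0] - T, c[1] - T), reverse=True)
# Objective-first: chase maximum strength-minus-ductility, a classic
# trade-off objective that pulls the search away from the feasible corner.
by_objective = sorted(pool, key=lambda c: c[0] - c[1], reverse=True)
```

With the feasibility ranking, the top candidate is feasible whenever any feasible candidate exists; the objective ranking has to wade through thousands of high-objective but constraint-violating candidates first.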

  • View profile for Santi Adavani

    Building AI for the Physical World

    6,073 followers

    🔬 Engineering design synthesis is moving from manual iteration to automated, data-driven approaches. MIT researchers map out how deep generative models are enabling this shift, with important technical implications for how we develop products, in the reference paper below.

    The paper provides a systematic analysis across:

    🎯 Design Problems:
    • Topology optimization
    • Materials & microstructure design
    • 2D/3D shape synthesis
    • Multi-component product design

    💾 Data Representations:
    • Voxels & point clouds for 3D
    • Images for 2D designs
    • Parametric specs for manufacturing
    • Graphs for component relationships

    🧮 Model Architectures:
    • GANs with various conditioning approaches
    • VAEs for latent space exploration
    • RL for sequential design decisions
    • Integration with physics-based simulation

    ⚖️ Loss Functions:
    • Performance metrics from simulation
    • Manufacturability constraints
    • Style transfer for design aesthetics
    • Multi-objective optimization

    📊 Key Datasets:
    • UIUC airfoil database
    • ShapeNet/ModelNet for 3D shapes
    • BIKED bicycle design dataset
    • Material microstructure collections

    📝 Reference: "Deep Generative Models in Engineering Design: A Review" by Regenwetter et al. https://lnkd.in/g_mMR-8y

    S2 Labs #EngineeringDesign #MachineLearning #TechnicalResearch

  • View profile for Sattyam Maurya

    Design Engineer @Cyient - Pratt and Whitney, USA || IIT Bombay, M.Tech, Design || B.Tech, BIET Jhansi ( Gold medalist 🥇) || 1 Million+ Impression, LinkedIn || 230k+ Views, YouTube▶️

    5,042 followers

    🚀 𝐓𝐨𝐩𝐨𝐥𝐨𝐠𝐲 𝐎𝐩𝐭𝐢𝐦𝐢𝐳𝐚𝐭𝐢𝐨𝐧: 𝐓𝐡𝐞 𝐅𝐮𝐭𝐮𝐫𝐞 𝐨𝐟 𝐒𝐭𝐫𝐮𝐜𝐭𝐮𝐫𝐚𝐥 𝐃𝐞𝐬𝐢𝐠𝐧

    In today’s engineering world, the focus is shifting toward design efficiency, performance improvement, and sustainability. One of the most powerful methods driving this transformation is topology optimization.

    🔹 𝑾𝒉𝒂𝒕 𝒊𝒔 𝒊𝒕?
    Topology optimization is a computational design approach that determines the most efficient way to distribute material within a defined design space—considering loads, constraints, and performance goals.

    🔹 𝑾𝒉𝒚 𝒊𝒕 𝒎𝒂𝒕𝒕𝒆𝒓𝒔?
    ✅ Weight reduction
    ✅ Improved performance
    ✅ Cost savings
    ✅ Sustainability
    ✅ Design innovation
    ✅ Additive manufacturing compatibility
    ✅ Multiphysics integration

    🔹 Industry applications:
    Airbus – A380 wing rib optimized → ~40% lighter & 20% stiffer
    GE Aviation – Fuel nozzle redesigned via topology optimization & 3D printing → reduced part count, higher efficiency
    Volkswagen – Steering bracket optimized → ~50% lighter
    BMW – Engine mount redesign → 20% lighter, 15% cheaper
    ANSYS & Frustum – Patient-specific medical implants optimized for strength and functionality
    Boeing – Structural aerospace systems via open-source FEM (Z88)

    From aerospace to automotive, medical to defense, topology optimization is revolutionizing the way we design and manufacture components.

    🌍 The future of structural design lies not in adding more material, but in using material smartly.
    🔧 As engineers and designers, embracing these methods will be key to building lighter, stronger, and more sustainable systems.

    💡 What’s your take—do you see topology optimization becoming a standard design practice across industries in the next decade?

    #Engineering #Design #TopologyOptimization #FiniteElementAnalysis #Innovation #Sustainability #AdditiveManufacturing #StructuralDesign #DesignEngineering #GenerativeDesign #LightweightDesign #AerospaceEngineering #AutomotiveEngineering #MedicalDevices #SustainableDesign #FutureOfDesign #MechanicalEngineering #ProductDevelopment #EngineeringInnovation #AdvancedManufacturing #CADDesign #EngineeringExcellence #SmartDesign #3DPrintingInnovation #NextGenEngineering #EngineeringCommunity
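A minimal sketch of the underlying idea, under strong simplifying assumptions (a set of parallel springs stands in for a finite-element mesh; all numbers are invented): SIMP-style penalization plus an optimality-criteria update concentrates a fixed material budget in the most effective elements and drives the rest toward "void", which is exactly the solid/void pattern topology optimization produces.

```python
import math

# Toy SIMP-style topology optimization (not a production FEM code):
# distribute a material budget VOL over N parallel springs to maximize
# total stiffness K = sum(x_i^P * k_i). Penalization P > 1 makes
# intermediate densities inefficient, pushing the design toward 0/1.
N, VOL, P, XMIN, MOVE = 8, 3.0, 3.0, 0.01, 0.2
k = [1.0 + 0.5 * i for i in range(N)]      # element stiffness coefficients

def stiffness(x):
    return sum(xi ** P * ki for xi, ki in zip(x, k))

def oc_step(x):
    """One optimality-criteria update: rescale densities by their
    sensitivity dK/dx_i, bisecting the multiplier lam to hold the volume."""
    sens = [P * xi ** (P - 1) * ki for xi, ki in zip(x, k)]
    lo, hi = 1e-9, 1e9
    while hi / lo > 1.0 + 1e-8:
        lam = math.sqrt(lo * hi)
        xn = []
        for xi, si in zip(x, sens):
            val = xi * math.sqrt(si / lam)                 # OC rescaling
            val = max(max(XMIN, xi - MOVE),                # move limits and
                      min(min(1.0, xi + MOVE), val))       # box bounds
            xn.append(val)
        if sum(xn) > VOL:
            lo = lam
        else:
            hi = lam
    return xn

x = [VOL / N] * N                 # start from a uniform "grey" design
for _ in range(60):
    x = oc_step(x)
```

After iterating, the stiffest springs end up near full density and the least effective ones near void, mirroring how material is "removed" from a real design domain while the volume constraint is respected.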

  • View profile for Xavier JULLIEN

    Prototypist technician (Space Batteries Li-ion).

    1,052 followers

    🚫 What if we stopped over-dimensioning our parts?

    🔍 Are you familiar with topology optimization? It’s a revolutionary approach that removes unnecessary material from a part while maintaining its mechanical performance. The result? Lightweight, strong, and highly efficient designs! 🚀

    Using simulation software like SolidWorks, ANSYS, or Fusion 360, we can:
    ⚖️ Reduce part weight by up to 60%
    🔩 Optimize stress distribution
    💡 Improve performance, aesthetics, and—most importantly—material savings!

    Topology optimization is already a key tool in:
    ✈️ Aerospace
    🛰️ The space industry
    🏎️ Motorsport
    🖨️ And especially additive manufacturing, which enables the production of complex geometries.

    Of course, there are challenges:
    ⚠️ Some designs cannot be machined conventionally
    🛠️ It requires advanced tools and skilled engineers
    🔄 Sometimes the model must be reinterpreted for industrial viability

    But one thing is clear: the future of design lies in intelligently lightweight parts! 🌟

    What about you? Have you integrated topology optimization into your projects, or do you think it’s reserved for large industries?

    #TopologicalOptimization #MechanicalDesign #Engineering #SolidWorks #AdditiveManufacturing #Innovation #CAO #Simulation #MechanicalEngineering

  • View profile for Victor GUILLER

    Design of Experiments (DoE) Expert @L’Oréal | 💪 Empowering R&I Formulation labs with Data Science & Smart Experimentation | ⚫ Black Belt Lean Six Sigma | 🇫🇷 🇬🇧 🇩🇪

    2,977 followers

    🎉 Continuing the 2025 series on the foundations of Design of Experiments (#DoE) and modern experimentation approaches, here’s Part 3: Optimization Methods in Experimentation (a more complete version will be published on Medium soon).

    🔎 𝐎𝐩𝐭𝐢𝐦𝐢𝐳𝐚𝐭𝐢𝐨𝐧 lies at the heart of experimental design, helping researchers and practitioners refine processes, improve performance, and uncover the best experimental conditions while minimizing resources. The evolution of optimization methods reflects a balance between leveraging models to guide experimentation and exploring unknown spaces without assumptions.

    🏛️ 𝐌𝐨𝐝𝐞𝐥-𝐁𝐚𝐬𝐞𝐝 𝐎𝐩𝐭𝐢𝐦𝐢𝐳𝐚𝐭𝐢𝐨𝐧
    Model-based approaches rely on predefined mathematical models to guide experiments. These include classical designs like Central Composite Designs (CCD) and Box-Behnken Designs, which assume polynomial models for response surfaces, as well as Bayesian Optimization, which combines surrogate models and acquisition functions to propose new experiments iteratively. These methods excel when prior understanding of the system exists or when computational efficiency is key.

    🌌 𝐌𝐨𝐝𝐞𝐥-𝐀𝐠𝐧𝐨𝐬𝐭𝐢𝐜 𝐎𝐩𝐭𝐢𝐦𝐢𝐳𝐚𝐭𝐢𝐨𝐧
    In contrast, model-agnostic methods avoid assumptions about the underlying system, focusing instead on geometric or distance-based considerations. Space-filling designs, such as Latin Hypercube or Maximin designs, ensure even exploration of the experimental space, making them well suited to nonlinear responses. Simplex optimization methods, on the other hand, take geometric steps to converge on optimal conditions, relying purely on iterative distance-based logic and the ranking of results.

    ⏳ 𝐒𝐞𝐪𝐮𝐞𝐧𝐭𝐢𝐚𝐥 𝐯𝐬. 𝐏𝐚𝐫𝐚𝐥𝐥𝐞𝐥 𝐄𝐱𝐩𝐞𝐫𝐢𝐦𝐞𝐧𝐭𝐚𝐭𝐢𝐨𝐧
    Sequential methods, such as Bayesian Optimization or Simplex, prioritize small batches of experiments. These allow iterative learning and adaptation, which is particularly useful when resources are limited or experiments are costly. Parallel approaches, favored in space-filling and model-based DoE designs, enable larger experiment batches to be run simultaneously, providing a more comprehensive picture of the experimental space at the expense of iterative refinement.

    🎯 𝐓𝐚𝐤𝐞𝐚𝐰𝐚𝐲
    The choice of optimization method hinges on the balance between prior knowledge, resource availability, and the need for exploration versus exploitation. Sequential methods align with adaptive learning, while parallel methods accelerate discovery in larger spaces. Similarly, the decision between model-based and model-agnostic approaches depends on the complexity of the system and the availability of prior information.

    📢 What optimization approaches have you found most effective in your experimental work?

    #Optimization #ExperimentalDesign #DataScience #Innovation
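A space-filling design like the Latin hypercube mentioned above is easy to sketch (a minimal stdlib implementation for illustration, not code from any particular DoE package): each dimension is split into n equal strata, and every stratum receives exactly one point, guaranteeing even one-dimensional coverage of the unit cube.

```python
import random

# Minimal Latin hypercube sampler: n points in [0, 1)^d such that each of
# the n strata of every dimension contains exactly one point.
def latin_hypercube(n, d, seed=0):
    rng = random.Random(seed)
    cols = []
    for _ in range(d):
        strata = list(range(n))
        rng.shuffle(strata)                       # random pairing of strata
        # One uniform draw inside each stratum [s/n, (s+1)/n).
        cols.append([(s + rng.random()) / n for s in strata])
    # Transpose the per-dimension columns into n points.
    return [tuple(col[i] for col in cols) for i in range(n)]

design = latin_hypercube(10, 2)
```

The stratification is what distinguishes this from plain random sampling: projecting the design onto any single factor still covers all ten strata, which is why Latin hypercubes are a common starting batch for the parallel, model-agnostic strategies described above.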
