🚀 Scientific Machine Learning: The Revolution of Computational Science with AI

In recent years, we have seen impressive advances in Machine Learning (ML), but when it comes to scientific and engineering problems, a critical challenge remains: limited data and complex physical models. This is where Scientific Machine Learning (SciML) comes in: a field that combines machine learning with physics-based modeling to create more robust, interpretable, and efficient solutions.

🔹 Why isn't traditional ML enough?
Neural networks and statistical models are great at detecting patterns in large datasets, but many scientific phenomena have limited data or follow fundamental laws, such as the Navier-Stokes equations in fluid dynamics or the Schrödinger equation in quantum mechanics. Training a purely data-driven model, without physical knowledge, can lead to inaccurate or physically inconsistent predictions.

🔹 What makes SciML different?
SciML bridges data-driven models with partial differential equations (PDEs), physical laws, and structural knowledge, creating hybrid approaches that are more reliable. A classic example is Physics-Informed Neural Networks (PINNs), which embed differential equations directly into the loss function of the neural network. This makes it possible to solve complex simulation problems with high accuracy, even when data is scarce.

🔹 Real-world applications where SciML is already transforming science:
✅ Climate & Environment: Hybrid deep learning + atmospheric equations improve climate predictions.
✅ Engineering & Physics: Neural networks accelerate computational simulations in structural mechanics and fluid dynamics.
✅ Healthcare & Biotechnology: Simulations of molecular interactions for drug discovery.
✅ Energy & Sustainability: Optimized modeling of nuclear reactors and next-generation batteries.

🔹 Challenges and the future of SciML
We still face issues such as high computational costs, training stability, and the pursuit of more interpretable models.
However, as we continue to integrate deep learning with scientific principles, the potential of SciML to transform multiple fields is immense. 💡 Have you heard about Scientific Machine Learning before? If you work with computational physics, modeling, or applied machine learning, this is one of the most promising fields to explore! 🚀 #SciML #MachineLearning #AI #PhysicsInformed #DeepLearning #ComputationalScience
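To make the PINN idea above concrete, here is a minimal numpy sketch of a physics-informed fit. Instead of a neural network with automatic differentiation, it uses a polynomial ansatz (an assumption made purely so the example stays self-contained), so minimizing the ODE residual plus a boundary term reduces to linear least squares. The equation u'(x) = -u(x), the collocation points, and the boundary weight are all illustrative choices, not from the post.

```python
import numpy as np

# Physics-informed fit for the ODE u'(x) = -u(x), u(0) = 1 on [0, 1].
# Ansatz: u(x) = sum_k c_k x^k. The "physics loss" is the squared ODE
# residual at collocation points; for a polynomial it becomes a linear
# least-squares problem -- the same loss construction a PINN uses with
# a neural net and autodiff, in miniature.
degree = 8
xs = np.linspace(0.0, 1.0, 50)            # collocation points

# Residual of u' + u = 0: column k holds d/dx x^k + x^k at each point.
cols = []
for k in range(degree + 1):
    du = k * xs ** (k - 1) if k > 0 else np.zeros_like(xs)
    cols.append(du + xs ** k)
A_phys = np.stack(cols, axis=1)
b_phys = np.zeros(len(xs))

# Boundary condition u(0) = 1, weighted like the data term in a PINN loss.
A_bc = np.zeros((1, degree + 1))
A_bc[0, 0] = 1.0
weight = 10.0

coeffs, *_ = np.linalg.lstsq(
    np.vstack([A_phys, weight * A_bc]),
    np.concatenate([b_phys, [weight * 1.0]]),
    rcond=None,
)

def u(x):
    """Evaluate the fitted solution; the exact answer is exp(-x)."""
    return sum(c * x ** k for k, c in enumerate(coeffs))
```

Even with no solution data beyond the single boundary value, the residual term alone pins the fit down to the exact solution exp(-x), which is the "accurate even when data is scarce" property the post describes.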
Knowledge-Based Approaches in Engineering
Summary
Knowledge-based approaches in engineering use structured frameworks like knowledge graphs and physics-informed models to help machines understand and solve complex engineering problems with less data. These methods connect information, relationships, and rules, making solutions more reliable and easier to explain than purely data-driven techniques.
- Connect your data: Organize your engineering information using knowledge graphs to reveal hidden relationships and make smarter decisions across projects.
- Mix models smartly: Blend physical laws and expert insights into your machine learning models to boost accuracy, even when you don’t have massive datasets.
- Preserve context: Use techniques that keep the structure and connections of your data intact, so you can answer complex questions and spot patterns that chunked or unstructured methods might miss.
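The "connect your data" and "preserve context" points above can be sketched in a few lines: facts stored as subject-relation-object triples, plus a query that follows relationships across multiple hops. All entities and relations below are invented for illustration.

```python
# Minimal knowledge-graph sketch: facts as (subject, relation, object)
# triples. Entities and relations are made up for illustration.
triples = [
    ("pump_P1",      "part_of",     "cooling_loop"),
    ("cooling_loop", "part_of",     "reactor_unit"),
    ("pump_P1",      "supplied_by", "vendor_A"),
    ("vendor_A",     "located_in",  "plant_2"),
]

def objects(subject, relation):
    """All objects linked to `subject` via `relation`."""
    return [o for s, r, o in triples if s == subject and r == relation]

def reachable(subject, relation):
    """Transitive closure over one relation (e.g. nested assemblies)."""
    found, frontier = set(), [subject]
    while frontier:
        nxt = [o for s in frontier for o in objects(s, relation)]
        frontier = [o for o in nxt if o not in found]
        found.update(frontier)
    return found
```

Because the structure is kept intact rather than flattened, a multi-hop question like "which plants are behind this pump's supply chain?" is just a walk over `supplied_by` and `located_in`, something chunked or unstructured storage cannot answer directly.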
-
𝐀𝐫𝐭𝐢𝐟𝐢𝐜𝐢𝐚𝐥 𝐈𝐧𝐭𝐞𝐥𝐥𝐢𝐠𝐞𝐧𝐜𝐞 – 𝑨𝒕𝒕𝒆𝒏𝒕𝒊𝒐𝒏 𝑰𝒔 𝑨𝒍𝒍 𝒀𝒐𝒖 𝑵𝒆𝒆𝒅 ?!?🤔

The famous Transformer architecture, introduced in the paper 𝘈𝘵𝘵𝘦𝘯𝘵𝘪𝘰𝘯 𝘐𝘴 𝘈𝘭𝘭 𝘠𝘰𝘶 𝘕𝘦𝘦𝘥, has undoubtedly attracted a lot of attention and, even more importantly, has changed the world forever. However, it is the massive amounts of data that truly fuel these algorithms, a fact often overlooked. For many applications, this isn't a big issue (setting aside legal considerations):

📀 𝐆𝐏𝐓-3: Trained on 45TB of data
📀 𝐌𝐢𝐝𝐣𝐨𝐮𝐫𝐧𝐞𝐲: Trained on hundreds of millions of images
📀 𝐎𝐩𝐞𝐧𝐀𝐈 𝐂𝐨𝐝𝐞𝐱: 159 gigabytes of Python code from 54 million GitHub repositories

But for industrial and engineering applications, the scenario is quite different:

💾 𝐅𝐚𝐢𝐥𝐮𝐫𝐞 𝐃𝐚𝐭𝐚: Often measured in defects per million opportunities. Even with billions of parts, there may well be fewer than 50,000 failures, e.g., in the automobile industry.
💾 𝐂𝐅𝐃 𝐒𝐢𝐦𝐮𝐥𝐚𝐭𝐢𝐨𝐧𝐬: A single simulation can generate a terabyte of data, equivalent to over 100 days of standard-definition video.

While gathering sufficient data is feasible for some use cases, and many of my colleagues at Siemens are working on this, numerous use cases still seem infeasible.

𝐃𝐢𝐟𝐟𝐞𝐫𝐞𝐧𝐭 𝐀𝐩𝐩𝐫𝐨𝐚𝐜𝐡𝐞𝐬 𝐚𝐫𝐞 𝐍𝐞𝐞𝐝𝐞𝐝: To address this, we need approaches that leverage existing knowledge or integrate physics reasoning within models to reduce data requirements. #DigitalTwins curating knowledge beyond data are key to achieving scalable solutions. Some promising technologies include:

🚀 𝐇𝐲𝐛𝐫𝐢𝐝 𝐒𝐢𝐦𝐮𝐥𝐚𝐭𝐢𝐨𝐧 𝐌𝐨𝐝𝐞𝐥𝐬: Combining physics-based models and data-based models.
🚀 𝐏𝐡𝐲𝐬𝐢𝐜𝐬-𝐈𝐧𝐟𝐨𝐫𝐦𝐞𝐝 𝐑𝐞𝐠𝐮𝐥𝐚𝐫𝐢𝐳𝐞𝐫𝐬: Incorporating physics knowledge into the loss function.
🚀 𝐊𝐧𝐨𝐰𝐥𝐞𝐝𝐠𝐞 𝐆𝐫𝐚𝐩𝐡𝐬: Allowing reasoning on well-curated data (e.g., as prototyped in our HiSimcenter Demo).
🚀 𝐅𝐢𝐫𝐬𝐭-𝐏𝐫𝐢𝐧𝐜𝐢𝐩𝐥𝐞 𝐌𝐚𝐜𝐡𝐢𝐧𝐞 𝐋𝐞𝐚𝐫𝐧𝐢𝐧𝐠: Unearthing first-principle relations (e.g., AI-Descartes, AI Feynman, AI-Lorentz).

What are your thoughts? Any examples you could share? 🌟 Also check the comments for more links.
🔗 #ArtificialIntelligence #MachineLearning #DataScience #Engineering #AI
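The first item in the list above, a hybrid simulation model, can be sketched as a physics-based baseline plus a small data-driven correction trained on very few samples. Everything here is illustrative and not from any real industrial use case: a Newtonian-cooling "truth" with rate 0.30, a baseline using an imperfect handbook rate of 0.25, and a low-order polynomial fitted to the residual from only five measurements.

```python
import numpy as np

# Hybrid model sketch: physics baseline + data-driven residual correction.
# All numbers are invented for the example.

# "True" process: Newtonian cooling with rate 0.30 (pretend unknown).
t_data = np.array([0.0, 1.0, 2.0, 4.0, 8.0])        # only five samples
measured = 20.0 + 80.0 * np.exp(-0.30 * t_data)

def physics(t):
    """Baseline from first principles, with an imperfect rate of 0.25."""
    return 20.0 + 80.0 * np.exp(-0.25 * t)

# Data-driven part: fit a low-order polynomial to the residual only.
# A black-box fit of the full curve from 5 points would overfit; the
# residual is small and smooth, so it needs far less data -- this is
# how hybrid models reduce data requirements.
resid_coeffs = np.polyfit(t_data, measured - physics(t_data), deg=2)

def hybrid(t):
    """Physics baseline corrected by the learned residual."""
    return physics(t) + np.polyval(resid_coeffs, t)
```

At a time not in the training set (e.g. t = 3), the hybrid prediction lands far closer to the true temperature than the physics baseline alone, even though the data-driven part saw only five points.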
-
The Quiet Power of Knowledge Graphs in OpenBOM

If you work in engineering or manufacturing, chances are you deal with a lot of disconnected data: BOMs, CAD files, revisions, vendors, purchase orders. They often live in different systems or spreadsheets. The real challenge is not just storing this data, but connecting it in a meaningful way. That's where knowledge graphs come in.

In OpenBOM, we've been using a graph-based model as a foundation, a flexible data model that helps link data:

A part to an assembly
A CAD file to a revision
A vendor to a purchase order
A configuration to a customer

These relationships form a network of knowledge that helps answer real-world questions like:

Where is this part used?
What's affected if we change a supplier?
Which assemblies use a certain material or component?

In my article today, I explain:
🔹 What a knowledge graph is (with simple, real-world examples)
🔹 How OpenBOM uses it behind the scenes
🔹 Why this structure matters, especially as AI becomes part of everyday workflows
🔹 How the same graph powers the new OpenBOM AI Agent

OpenBOM helps teams connect the dots. The knowledge graph works in the background, supporting better decisions. And now it also provides the structured information AI agents need to understand context, follow relationships, and deliver accurate, traceable answers.

📖 Read the full article: The Quiet Power of Knowledge Graphs in OpenBOM
👉 [link in the comments]

#OpenBOM #KnowledgeGraph #EngineeringData #PLM #DigitalThread #Manufacturing #ProductDevelopment #AI #BOMManagement #AIAgent
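The "where is this part used?" and "what's affected if we change a supplier?" questions above can be sketched on a toy graph-shaped BOM. The part names, vendors, and data model below are invented for illustration and are not OpenBOM's actual schema or API.

```python
# Toy graph-shaped BOM: assemblies point at their child parts, and
# parts carry a supplier. All names are made up for the example.
uses = {
    "bike":  ["frame", "wheel"],
    "wheel": ["hub", "spoke", "rim"],
    "hub":   ["bearing"],
}
supplier_of = {"bearing": "acme", "rim": "acme", "frame": "steelco"}

def where_used(part):
    """All assemblies that contain `part`, walking up the graph."""
    parents = {c: p for p, children in uses.items() for c in children}
    chain = []
    while part in parents:
        part = parents[part]
        chain.append(part)
    return chain

def impacted_by_supplier(vendor):
    """Every assembly touched, via its parts, if `vendor` changes."""
    hit = set()
    for part, v in supplier_of.items():
        if v == vendor:
            hit.update(where_used(part))
    return hit
```

Because the relationships are stored explicitly, the change-impact query is just a traversal; the same connected structure is what lets an AI agent follow relationships and return traceable answers instead of guesses.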
-
When applying methods like HAZOP, FMEA, or PHA, the approach consists of decomposing a system into parts and identifying possible changes in process parameters (e.g., temperature, pressure, flow) or failures of components in installations in order to anticipate scenarios, calculate their probabilities, and assess levels of risk.

Known behaviours (sometimes 'laws') of phenomena, for instance in thermodynamics or chemistry, are very useful for calculating and anticipating the extent of a pressure or heat increase, while knowledge of components' behaviours and their interactions in equipment makes it possible to imagine possible breakdowns and their implications. Appreciating their likelihood has a subjective side but can also be based on data provided by experience (e.g., frequencies) or on calculations. With this type of analysis comes the question of prevention and mitigation measures, and their reasonableness given what has been analysed.

Such techniques were developed from the mid-20th century by engineers precisely for this purpose, in sectors such as aviation and the chemical and nuclear industries, and these principles, applied through risk assessment methods to engineered systems, have proven their worth. By decomposing the system, imagining potential breakdowns, relying on our knowledge of phenomena, and introducing probabilities, such methods provided adequate approaches and have improved over time.

With the ambition of introducing humans (e.g., human reliability assessment, HRA) and then organisations into risk assessment, several questions were formulated (some time ago now) about the limits of applying such principles to different kinds of phenomena. In this article, I challenged the possibility of borrowing such methodological principles for this purpose. I used the discourse on complexity to make the point, by contrasting technical versus social systems (see visual).
https://lnkd.in/egDyf2_B

By creating bridges between natural, technical, and social phenomena, system and complexity lenses have made it possible to question methodological principles such as risk assessment, as well as their limits. This opens many interesting avenues for alternative approaches, with inspiration coming from control theory, systems theory, cybernetics, system dynamics, complex adaptive systems, agent-based modeling, and more. These ideas have been variously incorporated in the proposals of system safety engineering and cognitive systems engineering (Jens Rasmussen was influential in these traditions, https://lnkd.in/ey_pJwQg), but also in the sociology of safety, and it is this last path that I opened in this article. (Note also that the question of the limits of audit was already introduced in this article: https://lnkd.in/ecgxA8tg.)
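The quantitative side of the engineered-system analyses described above, combining component failure probabilities into a system-level figure, can be illustrated with a minimal fault-tree-style calculation. The probabilities and the architecture (two redundant pumps in series with a valve) are made up for the example and assume independent failures, the standard simplifying assumption such methods start from.

```python
# Minimal fault-tree-style calculation with independent failures.
# All probabilities are invented for the example.
p_pump = 0.01      # each redundant pump failing over the mission
p_valve = 0.002    # the valve failing over the mission

# Redundant pair: the subsystem fails only if BOTH pumps fail (AND gate).
p_pumps = p_pump * p_pump

# Series with the valve: the system fails if EITHER the pump pair OR
# the valve fails (OR gate over independent events).
p_system = 1 - (1 - p_pumps) * (1 - p_valve)
```

Even this toy case shows the logic of the method: decomposition into components, known combination rules, and a resulting number that can be compared against a risk criterion. The article's point is precisely that this logic, so effective for valves and pumps, does not transfer cleanly to social systems.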