Special Issues

Optimized Artificial Neural Network Architectures in Engineering

Submission Deadline: 31 May 2025

Guest Editors

Prof. Zong Woo Geem, College of IT Convergence, Gachon University, Seongnam, 03075, South Korea
Prof. Gebrail Bekdaş, Department of Civil Engineering, Istanbul University - Cerrahpaşa, Istanbul, 34000, Turkey
Prof. Umit Isikdag, Department of Architecture, Mimar Sinan Fine Arts University, Istanbul, 34000, Turkey

Summary

Research on Artificial Neural Networks (ANNs) has grown rapidly since the turn of the millennium and entered a new dimension after 2010 with the emergence of Deep Neural Networks (DNNs). Today, ANNs and DNNs are widely used in areas ranging from classification and regression on tabular data to key tasks in computer vision and natural language processing. Moreover, today's models not only process information but also generate new information through approaches such as Generative Adversarial Networks and Transformer-based large language and large vision models. In the coming years, ANNs and DNNs will become a de facto standard for information processing and generation across a wide scope of engineering applications, ranging from analysis and design to testing and simulation of engineering systems. In parallel, research on finding new or better architectures for ANNs and DNNs is growing constantly. Transformer architectures are a key example of how a new architecture can broaden the applicability and practicality of neural networks, for instance by enabling the development of Diffusion Models and GPTs. Success, however, depends not only on proposing new architectures but also on improving the accuracy and speed of existing and new ones. In this context, methods such as Neural Architecture Search (NAS) and Hyperparameter Optimization are of key importance for enhancing artificial and deep neural networks.
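
To illustrate the kind of hyperparameter optimization referred to above, the following minimal sketch tunes the learning rate and L2 penalty of a small neural network by plain random search. It assumes scikit-learn's MLPRegressor and synthetic tabular data from make_regression purely for illustration; submissions to this issue would typically use problem-specific engineering data and more advanced search strategies.

import random
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

# Synthetic tabular regression data standing in for an engineering dataset.
X, y = make_regression(n_samples=400, n_features=12, noise=0.1, random_state=0)

best_score, best_params = float("-inf"), None
for _ in range(20):  # trial budget
    params = {
        "learning_rate_init": 10 ** random.uniform(-4, -2),  # log-uniform learning rate
        "alpha": 10 ** random.uniform(-5, -2),               # L2 regularization strength
    }
    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500,
                         random_state=0, **params)
    # Score each trial by cross-validated R^2 and keep the best configuration.
    score = cross_val_score(model, X, y, cv=3, scoring="r2").mean()
    if score > best_score:
        best_score, best_params = score, params

print("best hyperparameters:", best_params, "R2:", round(best_score, 3))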


This special issue aims to present new methods and techniques for Neural Architecture Search (NAS) and Hyperparameter Optimization that improve the analysis, design, testing, and simulation of engineering systems.
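
In the same spirit, the sketch below shows the simplest form of neural architecture search: evaluating a small pool of candidate hidden-layer configurations and keeping the best-scoring one. It again assumes scikit-learn's MLPRegressor and synthetic data; the NAS methods sought for this issue would replace the exhaustive loop with gradient-based, evolutionary, or reinforcement-learning search.

import itertools
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=400, n_features=12, noise=0.1, random_state=0)

# Candidate architectures: 1 to 3 hidden layers with widths drawn from a small set.
widths = [16, 32, 64]
candidates = [arch for depth in (1, 2, 3)
              for arch in itertools.product(widths, repeat=depth)]

best_score, best_arch = float("-inf"), None
for arch in candidates:
    model = MLPRegressor(hidden_layer_sizes=arch, max_iter=500, random_state=0)
    # Cross-validated R^2 serves as the fitness of each candidate architecture.
    score = cross_val_score(model, X, y, cv=3, scoring="r2").mean()
    if score > best_score:
        best_score, best_arch = score, arch

print("best architecture:", best_arch, "R2:", round(best_score, 3))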

The potential topics include but are not limited to:

Hyperparameter Optimization for Engineering Design

Hyperparameter Optimization for Engineering Analysis

Hyperparameter Optimization for Engineering Testing & Simulation

Neural Architecture Search for Engineering Design

Neural Architecture Search for Engineering Analysis

Neural Architecture Search for Engineering Testing & Simulation

Optimized ANN Architectures for Prediction from Tabular Data in Engineering

Optimized ANN Architectures for Computer Vision in Engineering

Optimized ANN Architectures for Applications of Diffusion Models in Engineering

Optimized ANN Architectures for Applications of Large Language Models in Engineering

Optimized ANN Architectures for Multimodal Models in Engineering


Keywords

Hyperparameter Optimization, Neural Architecture Search, Optimized ANN Architectures, Computer Vision, Diffusion Models, Large Language Models, Multimodal Models
