
ASME Conference Presenter Attendance Policy and Archival Proceedings

2016;():V01BT00A001. doi:10.1115/DETC2016-NS1B.

This online compilation of papers from the ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference (IDETC/CIE2016) represents the archival version of the Conference Proceedings. According to ASME’s conference presenter attendance policy, if a paper is not presented at the Conference by an author of the paper, the paper will not be published in the official archival Proceedings, which are registered with the Library of Congress and are submitted for abstracting and indexing. The paper also will not be published in The ASME Digital Collection and may not be cited as a published paper.

Commentary by Dr. Valentin Fuster

36th Computers and Information in Engineering Conference: Computer-Aided Product and Process Development (CAPPD General)

2016;():V01BT02A001. doi:10.1115/DETC2016-59120.

Within the Collaborative Research Center (CRC) 666, an algorithm-based product development process has been established. It builds on state-of-the-art product development methodologies, enhanced to optimize the development process of integral bifurcated sheet metal parts. Algorithms based on mathematical optimization approaches, together with the initial product requirements and constraints, are applied to obtain an optimized design as a CAD model. Several challenges with this methodology remain to be solved, such as reducing the number of iteration steps needed to elaborate the final product design as a CAD model, handling heterogeneous data and software, and enhancing information exchange.

Therefore, this paper introduces a concept for a web-based application to support the algorithmic product development methodology and CAD modeling in CRC 666. It enables the development and adaptation of integral bifurcated parts based on the initial optimization data provided by XML-files. Besides the description of use cases and use scenarios, the concept is implemented as a web-based application for validation purposes. Based on the validation, advantages and limitations of the presented approach are discussed.

2016;():V01BT02A002. doi:10.1115/DETC2016-59239.

Variational design is an efficient method for product design; statistics indicate that about 70% of product design can be attributed to variational design. It is of vital importance for enterprises to derive a complex new product through variational design while reusing a previous part model series, which can greatly improve their capability and efficiency in product design and development. Therefore, this paper puts forward an efficient variational design method based on a master model that is automatically generated from a part model series. Using the method, designers can easily and efficiently create a new part from the master model through feature selection and parameter modification alone. In this way, the designers' workload is greatly alleviated and the design time is greatly reduced. Experimental results show that the proposed method is effective.
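As a schematic illustration of the master-model idea described above, consider the following minimal sketch. The dictionary-based part representation, the feature names, and both helper functions are hypothetical; the paper's master model is generated from actual CAD part series.

```python
def build_master_model(part_series):
    """Union of all features appearing in a series of part models; each
    feature keeps the parameter defaults from its first appearance in
    the series (an illustrative reading of the auto-generated master model)."""
    master = {}
    for part in part_series:
        for feat, params in part.items():
            master.setdefault(feat, dict(params))
    return master

def derive_part(master, selected, overrides=None):
    """New part = a selected subset of master features, with optional
    parameter modifications -- mirroring 'feature selection and
    parameter modification' as the only two design operations."""
    overrides = overrides or {}
    part = {f: dict(master[f]) for f in selected}
    for f, mods in overrides.items():
        part[f].update(mods)
    return part
```

A derived part copies parameter dictionaries, so modifying it leaves the master model untouched.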

Topics: Design
2016;():V01BT02A003. doi:10.1115/DETC2016-59242.

Reverse Engineering (RE), also known as “CAD reconstruction”, aims at reconstructing 3D geometric models of objects and mechanical parts starting from 3D measured data (points/meshes). In recent years, considerable developments in RE have been achieved thanks to both academic and industrial research (e.g., RE software packages).

The aim of this work is to provide an overview of state-of-the-art techniques and approaches presented in recent years, considering at the same time the tools and methods provided by commercial CAD software and RE systems. In particular, this article focuses on the “constrained fitting” approach, which considers geometrical constraints between the generated surfaces, improving the reconstruction result.

On the basis of this overview, possible theoretical principles are drafted with the aim of suggesting new strategies to make the CAD reconstruction process more effective and to obtain more readily usable CAD models.

Finally, a new RE framework is briefly outlined: the proposed approach hypothesizes a tool built within the environment of an existing CAD system and considers the fitting of a custom-built archetypal model, defined with all the a-priori known dimensions and constraints, to the scanned data.

2016;():V01BT02A004. doi:10.1115/DETC2016-59432.

A general approach to automatically calculate a datum feature simulator (DFS) for a real part model is proposed in this paper. The geometric errors of the real part are represented by the controlling point variation model (CPVM) of the geometric feature, and the geometric deviations are simulated and generated by the Monte Carlo method. The linear-feature and planar-feature CPVM models are first introduced; these models can simulate all possible size, position, form, and orientation variations, and they are compatible with the ASME/ISO standards for geometric tolerances. The determining rules of a DFS based on the CPVM model are presented, according to the definitions of a DFS in the ASME standards. The CPVM models for three common datum features, i.e., the planar, cylindrical, and prismatic datum features, are then established, and algorithms to determine the corresponding DFSs for different orders of datum precedence are developed. An example is given to demonstrate the method.
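To make the simulation idea concrete, here is a hypothetical sketch of Monte Carlo generation of planar-feature deviations followed by a least-squares datum plane fit. The CPVM formulation and the ASME datum rules in the paper are considerably richer; the grid sampling, the uniform deviation model, and the least-squares choice of simulator below are illustrative assumptions only.

```python
import random

def simulate_planar_feature(nx=5, ny=5, tol=0.02, seed=1):
    """Illustrative stand-in for a simulated planar feature: grid points
    whose normal-direction deviations are drawn inside a flatness
    tolerance band of total width `tol` (Monte Carlo sampling)."""
    rng = random.Random(seed)
    pts = []
    for i in range(nx):
        for j in range(ny):
            z = rng.uniform(-tol / 2, tol / 2)  # deviation within the band
            pts.append((float(i), float(j), z))
    return pts

def fit_datum_plane(pts):
    """Least-squares plane z = a*x + b*y + c fitted to the sampled
    feature points -- one simple candidate datum feature simulator."""
    n = len(pts)
    sx = sum(p[0] for p in pts); sy = sum(p[1] for p in pts)
    sz = sum(p[2] for p in pts)
    sxx = sum(p[0] * p[0] for p in pts); syy = sum(p[1] * p[1] for p in pts)
    sxy = sum(p[0] * p[1] for p in pts)
    sxz = sum(p[0] * p[2] for p in pts); syz = sum(p[1] * p[2] for p in pts)
    # Solve the 3x3 normal equations by Cramer's rule.
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    b = [sxz, syz, sz]
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det3(A)
    coeffs = []
    for k in range(3):
        Ak = [row[:] for row in A]
        for r in range(3):
            Ak[r][k] = b[r]
        coeffs.append(det3(Ak) / d)
    return tuple(coeffs)  # (a, b, c)
```

Repeating the simulate-and-fit pair over many Monte Carlo trials yields a distribution of candidate simulators rather than a single plane.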

2016;():V01BT02A005. doi:10.1115/DETC2016-59476.

Efforts towards a circular economy, the increased possibilities of sensor technology and monitoring, and developments in smart, connected products and systems have all contributed to the shift from selling products towards selling service-based offerings. Product-service systems are one example of service-based offerings, combining physical products or systems with a service-based earnings logic.

Many traditional companies might benefit from exploring the possibilities of service-based offerings, but the tools for modeling, simulating and analyzing such offerings are somewhat lacking. There are numerous existing methods for modeling products on one hand, or business and economic models on the other, but fewer combining the two. This article presents a modeling framework for describing, analyzing and simulating offerings consisting of interdependent technical systems and business processes, based on an extension of previous work on the Dimensional Analysis Conceptual Modeling (DACM) framework.

The goal of the framework is to allow modeling combinations of technical systems and business processes, resulting in flows of revenue between the stakeholders over a period of time. Further, the framework can be used to identify the objectives of the stakeholders, to identify contradictions at the variable level or conflicting objectives at the system level, and to propose solutions for resolving the conflicts and contradictions based on general solution principles. In the article, the framework is applied to a case study of designing a reverse vending machine.

2016;():V01BT02A006. doi:10.1115/DETC2016-59769.

Functional Modeling provides a direct, and sometimes abstract, method for depicting a product. Through this method, product architecture, concept generation and physical modeling can be used to obtain repeatable and more meaningful results. The Functional Basis approach of engineering design, as taught to engineering design students, provides the vocabulary for a uniform approach to function structures with functions (verbs) and flows (nouns). This paper suggests that the flows, particularly the “signal” flows, can be correlated to additional domains through transfer functions common in controls engineering. Controls engineering employs transfer functions to mathematically represent the physical or digital functions of a system or product, using block diagrams to show the individual steps. The research herein suggests correlations between the mathematical representations of transfer functions and the functional basis of engineering design through the actions performed upon “signal” flows. Specifically, the methodologies employed by controls engineering can relate to engineering design through 1) schematic similarities, 2) quantifiable performance metric inputs/outputs, 3) mathematical representations of the flows, and 4) isomorphic matching of the schematics. Control systems use block diagrams to represent the sequential steps of the system; these block diagrams parallel the function structures of engineering design. Performance metrics between the two domains can be complementary when decomposed to non-dimensional engineering units. Mathematical functions of the actions in a control system can resemble the functional basis functions through the use of bond graphs, by identifying the characteristic behavior of the functions on the flows. Isomorphic matching using the schematic diagrams can be used to find analogies based upon similar functionality and target performance metrics.
When these four similarities are established, parallels between the engineering design domain and controls engineering can be drawn. Examples of cross-domain matching via transfer functions and control systems are provided as contextualization for the concepts proposed. Pathways forward for this preliminary research are additionally suggested.

2016;():V01BT02A007. doi:10.1115/DETC2016-60265.

One of the most important factors that allows big enterprises to be competitive in the market is the capability to develop products tailored to specific customer requirements with a short lead time. This research studies an automated configuration tool for assembly lines, using a KBE approach. The configuration process for an assembly line was identified together with the experts of a large Italian manufacturing company. One of the critical issues is that the domain knowledge is mostly tacit. The implementation stage was carried out with a commercial development tool. Thanks to the modular design of the KBE system, a strong but flexible framework was achieved: the rule and data repositories, as well as the code of each module, can be updated more easily. An industrial case study was used to validate the proposed approach: the configuration of a cylinder head assembly line was performed.

2016;():V01BT02A008. doi:10.1115/DETC2016-60353.

This paper presents an approach for feature-based computer-aided modeling of functions. Features are used in geometric CAD as a means to encapsulate primitive entities and operations into more complex forms that have engineering significance, which also allows faster modeling, uniformity of data sets between similar features, and reasoning support at the feature level. In recent research, a formal language for functions has been proposed that ensures consistency of function models against physics, especially the balance laws of mass and energy. The language is implemented in a software tool to support physics-based reasoning. In this paper, the primitive entities and relations of this language and tool are encapsulated to define more complex function features that have engineering significance. To demonstrate the approach and its benefits, three common functions from the Functional Basis vocabulary are defined as features and used in models, which are then used to show the reasoning potential of this approach.

2016;():V01BT02A009. doi:10.1115/DETC2016-60461.

The aim of this research is to support assembly line designers in conceiving new processes with optimal selection of automation levels. Several alternatives with various automation options may exist, so graphic representations and analyses of the different designs are needed. The goal is to offer a quick, exhaustive, and reliable way of modelling alternatives based on a given product design. To this end, we propose a new assembly task vocabulary to be combined with an existing lower-layer vocabulary of elementary motions and a graphic modelling language. These developments extend an existing automation decision approach, overcoming identified gaps and easing its implementation and computerization. The proposal facilitates the generation of assembly system alternatives with automation options, based on an initial representation. The generated alternatives are then subject to further analyses with regard to automation criteria and performance indicators, considering planned production targets.

2016;():V01BT02A010. doi:10.1115/DETC2016-60521.

This paper presents a novel algorithm, “Twig Match”, for feature-based shape retrieval systems. The algorithm exploits recent advances in computational methods for subgraph isomorphism to enable databases containing many thousands of components to be searched in less than a second. A face adjacency graph representation is created from a B-Rep model, allowing model comparison to be treated as a labelled subgraph isomorphism problem. This paper describes an experimental implementation which allows interactive specification of a target “feature”. By selectively including geometric filters on faces and on relations between neighbouring faces, the algorithm can ensure that matching topology is not incorrectly identified as matching geometry, while also offering users the ability to improve the precision of both queries and results. Experimental results show that Twig Match accurately retrieves matching and similar sub-parts from collections at speeds suitable for interactive applications.
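A toy version of the labelled subgraph-matching step can be sketched as follows. This brute-force matcher is far from the optimized subgraph-isomorphism algorithms the paper builds on, and the graph encoding (a face-label dictionary plus a set of undirected adjacency edges) is an assumption for illustration.

```python
def find_subgraph(pattern, target):
    """Brute-force labelled subgraph matching. `pattern` and `target`
    are (labels, edges) pairs: labels maps node -> face type (e.g.
    'plane', 'cylinder'); edges is a set of frozenset node pairs.
    Returns one injective mapping pattern->target that preserves labels
    and pattern adjacency, or None if no match exists."""
    p_labels, p_edges = pattern
    t_labels, t_edges = target
    p_nodes = list(p_labels)

    def extend(mapping):
        if len(mapping) == len(p_nodes):
            return dict(mapping)
        u = p_nodes[len(mapping)]
        for v in t_labels:
            if v in mapping.values() or t_labels[v] != p_labels[u]:
                continue
            # Every pattern edge into already-mapped nodes must exist
            # between the corresponding target faces.
            ok = all(frozenset((v, mapping[w])) in t_edges
                     for w in mapping
                     if frozenset((u, w)) in p_edges)
            if ok:
                mapping[u] = v
                found = extend(mapping)
                if found:
                    return found
                del mapping[u]
        return None

    return extend({})
```

The geometric filters described in the abstract would slot in as extra conditions next to the label check.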

Topics: Databases

36th Computers and Information in Engineering Conference: DAC 6 Design for Resilience and Failure Recovery

2016;():V01BT02A011. doi:10.1115/DETC2016-59426.

The objects in the Internet of Things (IoT) form a virtual space of information gathering and sharing through the networks. Designing IoT-compatible products that have the capabilities of data collection, processing, and communication requires an open and resilient architecture with flexibility and adaptability for dynamically evolving networks. Design for connectivity thus becomes an important subject in designing such products. To enable a resilience engineering approach for IoT systems design, quantitative measures of resilience are needed for analysis and optimization. In this paper, an approach for probabilistic design of IoT system architecture is proposed, where resilience is quantified with the entropy and mutual information associated with the probabilities of detection, prediction, and communication among IoT-compatible products. Information fusion rules and sensitivities are also studied.
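The entropy and mutual-information quantities mentioned above can be computed directly from probability tables. The paper defines its resilience measures over detection, prediction, and communication probabilities; the two helpers below only show the generic information-theoretic core, with the dict-based joint-distribution format as an assumption.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) from a joint distribution given as {(x, y): p}. A higher
    mutual information between what a node senses and what actually
    happened can be read as better detection capability."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)
```

A perfect binary detector attains the full 1 bit of mutual information; an uninformative one attains 0.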

Topics: Design , Internet , Resilience
2016;():V01BT02A012. doi:10.1115/DETC2016-59691.

This work proposes a new methodology for robust design optimization (RDO) of complex engineering systems. The method, capable of solving large-scale RDO problems, involves (1) an adaptive-sparse polynomial dimensional decomposition (AS-PDD) for stochastic moment analysis of a high-dimensional stochastic response, (2) a novel integration of score functions and AS-PDD for design sensitivity analysis, and (3) a multi-point design process, facilitating standard gradient-based optimization algorithms. Closed-form formulae are developed for the first two moments and their design sensitivities. The method allows both the stochastic moments and their design sensitivities to be determined concurrently from a single stochastic simulation or analysis. Precisely for this reason, the multi-point framework of the proposed method affords the ability to solve industrial-scale problems with large design spaces. The robust shape optimization of a three-hole bracket was accomplished, demonstrating the efficiency of the new method on industry-scale RDO problems.
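The score-function idea, that a moment and its design sensitivity come from the same samples, can be illustrated on a toy one-dimensional problem. This plain Monte Carlo sketch stands in for the paper's AS-PDD machinery; the Gaussian design variable and the specific response function are assumptions.

```python
import random

def moment_and_sensitivity(f, mu, sigma, n=200000, seed=7):
    """Score-function trick on a toy problem: with X ~ N(mu, sigma),
    estimate m1 = E[f(X)] and its design sensitivity d m1 / d mu from
    the SAME Monte Carlo samples, using the Gaussian score function
    d/dmu E[f(X)] = E[f(X) * (X - mu) / sigma**2]."""
    rng = random.Random(seed)
    s_f = s_grad = 0.0
    for _ in range(n):
        x = rng.gauss(mu, sigma)
        fx = f(x)
        s_f += fx
        s_grad += fx * (x - mu) / sigma ** 2
    return s_f / n, s_grad / n
```

For f(x) = x**2 the exact values are E[f] = mu**2 + sigma**2 and d/dmu E[f] = 2*mu, so the estimator can be checked analytically.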

2016;():V01BT02A013. doi:10.1115/DETC2016-60002.

Operation of autonomous and semi-autonomous systems in hostile and expensive-to-access environments requires great care and a risk-informed operating mentality to protect critical system assets. Space exploration missions, such as the Mars rovers Opportunity and Curiosity, are very costly and difficult to replace. These systems are operated in a very risk-averse manner to preserve their functionality; by constraining system operations to risk-averse activities, scientific mission goals deemed too risky cannot be achieved. We present a quantifiable method that increases the lifetime efficiency of achieving scientific goals via the Goal-Oriented, Risk Attitude-Driven Reward Optimization (GORADRO) method, along with a case study of simulated testing of the method. GORADRO relies upon local area information obtained by the system during operations and internal Prognostics and Health Management (PHM) information to determine system health and potential localized risks, such as areas where a system may become trapped (e.g., sand pits, overhangs, overly steep slopes), while attempting to access scientific mission objectives using an adaptable operating risk attitude. The results of our simulations and hardware validation using GORADRO show a large increase in the lifetime performance of autonomous rovers in a variety of environments, terrains, and situations, given a sufficiently tuned set of risk attitude parameters. By designing a GORADRO behavioral risk attitude parameter set, it is possible to increase system resilience in the unknown and dangerous environments encountered in space exploration and other similarly hazardous settings.
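The risk-attitude-driven trade-off can be caricatured in a few lines. The scoring rule below (expected reward minus a risk-attitude multiple of estimated risk) is a hypothetical simplification for illustration, not GORADRO's actual formulation.

```python
def select_goal(goals, risk_attitude):
    """Toy risk-attitude-driven goal selection: each candidate goal has
    an expected scientific reward and an estimated entrapment risk; the
    rover picks the goal maximizing reward - risk_attitude * risk.
    A large risk_attitude reproduces risk-averse operation; a small one
    accepts riskier, higher-reward objectives."""
    return max(goals, key=lambda g: g["reward"] - risk_attitude * g["risk"])
```

Sweeping the risk-attitude parameter shows how the same goal set yields different mission plans, which is the behavior the adaptable risk attitude exploits.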

Topics: Vehicles , Risk , Resilience
2016;():V01BT02A014. doi:10.1115/DETC2016-60230.

Traditional engineering design practice seeks to create reliable systems that maintain a desired minimum performance when subjected to a defined set of impulses. To manage impulses, designers implement techniques to specify systems that are resilient or robust to impulses. Resilient systems perform with degraded capacity when subjected to impulses while robust systems remain unaffected by impulses. In this paper we examine antifragility, a complement to resilience and robustness, to manage the impulse response of complex cyber systems. Where fragile systems fracture when subjected to impulses, antifragile systems become stronger. We discuss why this strengthening characteristic makes antifragility attractive for managing impulse response in complex cyber systems and develop a measure for antifragility that differentiates it from fragility, resiliency and robustness. We then discuss an antifragile cyber system to demonstrate the benefits of antifragility in an impulse-rich environment.
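A minimal classification of impulse responses consistent with the vocabulary above can be written directly; the paper develops a proper quantitative measure, whereas this sketch only encodes the qualitative distinctions between the four behaviors.

```python
def classify_impulse_response(before, after, tol=1e-9):
    """Toy classification of a system's capacity change across an
    impulse: antifragile systems gain capacity, robust systems are
    unaffected, resilient systems degrade but keep functioning, and
    fragile systems fracture (capacity drops to zero or below)."""
    if after > before + tol:
        return "antifragile"
    if abs(after - before) <= tol:
        return "robust"
    if after > 0:
        return "resilient"
    return "fragile"
```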


36th Computers and Information in Engineering Conference: SEIKM: Systems Engineering

2016;():V01BT02A015. doi:10.1115/DETC2016-59916.

Fault detection and identification (FDI) systems based on data mining and artificial intelligence techniques cannot guarantee a perfect success rate or provide analytical proof for their predictions. This characteristic is problematic when such an FDI system is monitoring a safety-critical process. In these cases, the predictions of the FDI system need to be verified by other means, such as tests on the process, to increase trust in the diagnosis. This paper contributes an extension of the Hierarchical Functional Fault Detection and Identification (HFFDI) system, a combination of a plant-wide and multiple function-specific FDI modules developed in past research. A test preparation and test-based verification phase is added to the HFFDI methodology. The functional decomposition of the process and the type of the faulty components guide the preparation of specific tests for every fault identifiable by the HFFDI system. These tests have the potential to confirm or disprove the existence of the fault(s) in the target process. The target is minor automation faults in redundant systems of the monitored process. The proposed extension of the HFFDI system is applied to a case study of a generic nuclear power plant model. Two HFFDI predictions are tested in single-fault scenarios (a successful and an incorrect prediction), and one prediction is tested in a two-fault scenario. The results of the case study show that the testing phase introduced in this paper is able to confirm correct fault predictions and reject incorrect ones; thus, the HFFDI extension presented here improves confidence in the HFFDI output.

2016;():V01BT02A016. doi:10.1115/DETC2016-60036.

The system-level structure and performance of complex networked systems (e.g., the Internet) are emergent outcomes resulting from the interactions among individual entities (e.g., the autonomous systems in the Internet); thus, such systems evolve in a bottom-up manner. In our previous studies, we proposed a framework for laying complex systems engineering on decision-centric foundations. In this paper, we apply that framework to modeling and analyzing the structure and performance of complex networked systems through the integration of random utility theory, continuum theory and percolation theory. Specifically, we propose a degree-based decision-centric (DBDC) network model based on random utility theory. We analyze the degree distribution and robustness of networks generated by the DBDC model using continuum theory and percolation theory, respectively. The results show that, by controlling node-level preferences, the model is capable of generating a variety of network topologies. Further, the robustness of the networks is observed to be highly sensitive to the nodes' preference for degree. The proposed decision-centric approach has two advantages: 1) it provides a more general model for networked systems by considering node-level preferences, and 2) the model can be extended by including non-structural attributes of nodes. With the proposed approach, systems that evolve in a bottom-up manner can be modeled to verify hypothesized evolution mechanisms. This helps in understanding the underlying principles governing system evolution, which is crucial to the development of design and engineering strategies for complex networked systems.
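A toy growth model in the spirit of the degree-based decision-centric idea can be sketched as follows. The attachment probability proportional to degree**beta is an illustrative assumption, not the paper's exact random-utility formulation; beta here plays the role of the node-level preference for degree.

```python
import random

def dbdc_network(n, m=2, beta=1.0, seed=3):
    """Toy degree-preference growth model: each arriving node links to
    m distinct existing nodes, choosing target v with probability
    proportional to degree_v ** beta. beta = 1 resembles preferential
    attachment (hub-heavy topologies); beta = 0 gives uniform random
    attachment (homogeneous topologies)."""
    rng = random.Random(seed)
    deg = {0: 1, 1: 1}          # seed: two connected nodes
    edges = {(0, 1)}
    for new in range(2, n):
        targets = set()
        while len(targets) < min(m, len(deg)):
            nodes = list(deg)
            weights = [deg[v] ** beta for v in nodes]
            targets.add(rng.choices(nodes, weights=weights)[0])
        deg[new] = 0
        for t in targets:
            edges.add((min(new, t), max(new, t)))
            deg[new] += 1
            deg[t] += 1
    return deg, edges
```

Varying beta and measuring the resulting degree distributions is a quick way to see the sensitivity of topology to node-level preferences that the abstract reports.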

2016;():V01BT02A017. doi:10.1115/DETC2016-60101.

This paper presents a methodology and tools to synthesize and assess different System-Level Designs. It utilizes the descriptive modeling language Object Process Methodology (OPM) to hierarchically describe the functionality of the System of Interest which is mapped by way of various intermediary models to an architecture in the Modelica numerical modeling language. The resulting Modelica architecture model is subsequently used as a framework for the rapid creation of alternative System of Interest designs by the variation of components.

To enable consistent assessment of the alternatives, Assessment Scenarios are created based on the functionality identified by the OPM decomposition. By defining a hierarchy of Modelica models with the Assessment Scenarios at the top, all the System of Interest alternative models can be composed into the Assessment Scenarios and the resulting models simulated. A combined score for each alternative across all the Assessment Scenarios is then computed by way of Multi-Objective Decision Analysis (MODA).

This paper demonstrates the approach with an actual student solar powered autonomous boat development project.
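The MODA combination step can be illustrated with a simple weighted sum. The scenario names, the normalization of scores to [0, 1], and the convention that weights sum to one are assumptions for this sketch.

```python
def moda_score(scenario_scores, weights):
    """Weighted-sum combination of per-scenario scores, a common MODA
    form: `scenario_scores` maps scenario name -> normalized score in
    [0, 1]; `weights` maps scenario name -> importance weight."""
    return sum(weights[s] * scenario_scores[s] for s in scenario_scores)

def rank_alternatives(alternatives, weights):
    """Order design alternatives (name -> scenario scores) best-first
    by their combined MODA score."""
    return sorted(alternatives,
                  key=lambda a: moda_score(alternatives[a], weights),
                  reverse=True)
```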

2016;():V01BT02A018. doi:10.1115/DETC2016-60482.

For many complex engineered systems, a risk informed approach to design is critical to ensure both robust safety and system reliability. Early identification of failure paths in complex systems can greatly reduce the costs and risks absorbed by a project in future failure mitigation strategies. By exploring the functional effect of potential failures, designers can identify preferred architectures and technologies prior to acquiring specific knowledge of detailed physical system forms and behaviors. Early design-stage failure analysis is enabled by model-based design, with several research methodologies having been developed to support this design stage analysis through the use of computational models. The abstraction necessary for implementation at the design stage, however, leads to challenges in validating the analysis results presented by these models.

This paper describes initial work on the comparison of models at varying levels of abstraction with results obtained on an experimental testbed in an effort to validate a function-based failure analysis method. Specifically, the potential functional losses of a simple rover vehicle are compared with experimental findings of similar failure scenarios. Expected results of the validation procedure suggest that a model’s validity and quality are a function of the depth to which functional details are described.
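A function-based failure propagation analysis of the kind discussed above can be caricatured as reachability over a function-flow graph. The assumption that a failed function disables everything downstream is a deliberate simplification; the method being validated reasons about failure behavior in much more detail.

```python
def functional_losses(flows, failed):
    """Illustrative early-design failure propagation: `flows` maps each
    function to the downstream functions that depend on its output.
    A failed function is assumed (simplistically) to take down every
    function reachable from it."""
    lost, frontier = set(failed), list(failed)
    while frontier:
        f = frontier.pop()
        for g in flows.get(f, ()):
            if g not in lost:
                lost.add(g)
                frontier.append(g)
    return lost
```

Comparing such predicted loss sets against testbed observations is the shape of the validation exercise the paper describes.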

2016;():V01BT02A019. doi:10.1115/DETC2016-60483.

Cyber-physical systems are gaining momentum in the manufacturing domain, and cloud manufacturing is also revolutionizing the manufacturing world. However, although numerous physical manufacturing machines are network-ready, very few are operated in a networked environment, owing to the lack of scalability of existing cyber-physical systems. Combining the features offered by cloud manufacturing and cyber-physical systems, we develop a service-oriented architecture for a scalable cyber-physical manufacturing cloud with MTConnect. A testbed of the cyber-physical manufacturing cloud is being developed based on this scalable architecture. In this system, manufacturing machines and their capabilities are virtualized in a cyber-physical cloud, and manufacturing operations are represented as web services so that they are accessible across the Internet. The performance of the testbed is evaluated, and test results show that our system achieves excellent service performance for manufacturing operations across the Internet.

Topics: Manufacturing , Design

36th Computers and Information in Engineering Conference: SEIKM: Design Informatics

2016;():V01BT02A020. doi:10.1115/DETC2016-59645.

Usage context is considered a critical driving factor for customers' product choices. In addition, the physical use of a product (i.e., user-product interaction) dictates a number of customer perceptions (e.g., level of comfort, ease of use, or users' physical fatigue). In the emerging Internet of Things (IoT), this work hypothesizes that it is possible to understand product usage while the product is in use by capturing user-product interaction data. Mining the data and understanding the comfort of the user adds a new dimension to the product design field. There has been tremendous progress in the field of data analytics, but its application in product design is still nascent. In this work, the application of feature learning methods to the identification of product usage context is demonstrated, where usage context is limited to the activity of the user. Two feature learning methods are applied to a walking activity classification task using smartphone accelerometer data. Results are compared with feature-based machine learning algorithms (neural networks and support vector machines) and demonstrate the benefits of feature learning methods over feature-based machine learning algorithms.
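The feature-based side of the comparison can be sketched with hand-crafted window statistics and a nearest-centroid classifier. These stand in for the SVM and neural-network baselines mentioned above; the specific features (mean and standard deviation of accelerometer magnitude) and the data layout are assumptions.

```python
from math import sqrt

def window_features(window):
    """Hand-crafted features for one accelerometer window, given as a
    list of (x, y, z) samples: mean and standard deviation of the
    signal magnitude -- a typical 'feature-based' representation."""
    mags = [sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return mean, sqrt(var)

def nearest_centroid(train, sample):
    """train: dict label -> list of feature tuples; classify `sample`
    by the closest per-label centroid (squared Euclidean distance)."""
    best, best_d = None, float("inf")
    for label, feats in train.items():
        dim = len(feats[0])
        centroid = [sum(f[k] for f in feats) / len(feats) for k in range(dim)]
        d = sum((sample[k] - centroid[k]) ** 2 for k in range(dim))
        if d < best_d:
            best, best_d = label, d
    return best
```

Feature learning methods replace `window_features` with representations learned from the raw windows, which is the substitution the paper evaluates.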

2016;():V01BT02A021. doi:10.1115/DETC2016-59662.

This paper explores the amount of information stored in the representational components of a function structure: vocabulary, grammar, and topology. This is done by classifying the previously developed functional composition rules into vocabulary, grammatical, and topological classes and applying them to function structures available in an external design repository. The pruned function structures of electromechanical devices are then evaluated for how accurately market values can be predicted using the graph complexity connectivity method. The accuracy varies inversely with the amount of information and the level of detail. Applying the topological rule does not significantly impact the predictive power of the models, while applying the vocabulary rules and the grammar rules reduces the accuracy of the predictions. Finally, the least predictive model set is the one with all rules applied. In this manner, this research approach quantifies the value of a representation for predicting or answering questions.

2016;():V01BT02A022. doi:10.1115/DETC2016-59677.

Many large-scale, complex engineering systems experience significant cost and schedule overruns during their development. Many factors contribute to these overruns, including increased system complexity, task rework, and the inability to exhaustively test all states and configurations of a given system, which often leads to redesign efforts. In this work, we focus specifically on task rework and its impact on project schedule overruns. We demonstrate that heavier-tail phenomena present in large-scale program development durations can be caused by task rework. Within this context, we hypothesize that one cause of task rework in a project development effort is misinterpretation of task instructions. We develop a computational framework for estimating the information content of a set of task instructions and the expected time to task completion. This reveals that the heavier-tailed duration phenomena present in many large-scale project development efforts can arise from task rework caused by misinterpretation.
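Two of the quantities discussed above, instruction ambiguity and expected completion time under rework, admit simple closed forms in a toy model. The geometric-retry assumption below (a misinterpretation forces a full redo) is illustrative, not the paper's framework.

```python
from math import log2

def interpretation_entropy(probs):
    """Ambiguity of a task instruction, modelled (illustratively) as the
    Shannon entropy of the distribution over possible interpretations."""
    return -sum(p * log2(p) for p in probs if p > 0)

def expected_completion_time(t_task, p_wrong):
    """If a wrong interpretation (probability p_wrong) forces a full
    redo, attempts follow a geometric distribution and
    E[T] = t_task / (1 - p_wrong). The heavy-tail intuition: expected
    duration blows up as p_wrong approaches 1."""
    return t_task / (1.0 - p_wrong)
```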

2016;():V01BT02A023. doi:10.1115/DETC2016-59772.

Until now, translating product features expressed in the market into quantifiable engineering metrics has primarily been a manual process. This manual process establishes product features from large-scale customer feedback using a product’s components from large-scale design specifications. This process exacerbates the complexity and sheer amount of information that designers must handle during the early stages of new product development. The methodology proposed in this paper automatically identifies product features by mapping terms that describe product features from technical descriptions and customer reviews. In order to discover terms related to the features expressed in the market, the authors of this work employ WordNet and the PageRank algorithm, which search for semantically similar terms in products’ technical descriptions. A case study demonstrates the methodology’s viability for matching product features that are extracted from online customer reviews to the technical descriptions that address them.
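The PageRank component of the approach can be illustrated with a plain iterative implementation over a term graph. Building that graph from WordNet similarities is outside the scope of this sketch, and the damping factor and iteration count are conventional defaults rather than the paper's settings.

```python
def pagerank(graph, damping=0.85, iters=100):
    """Iterative PageRank over a directed graph {node: [out-neighbours]}
    (all nodes appearing as keys). Used here as a sketch of ranking
    candidate feature terms by their connectivity in a semantic-
    similarity graph."""
    nodes = set(graph) | {v for outs in graph.values() for v in outs}
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - damping) / n for v in nodes}
        for u, outs in graph.items():
            if outs:
                share = damping * rank[u] / len(outs)
                for v in outs:
                    new[v] += share
            else:  # dangling node: spread its rank uniformly
                for v in nodes:
                    new[v] += damping * rank[u] / n
        rank = new
    return rank
```

Terms that many semantically similar terms point to accumulate rank, which is the signal used to surface feature-describing vocabulary.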

2016;():V01BT02A024. doi:10.1115/DETC2016-59829.

The increasing number of design documents created in the design process provides a useful source of process-oriented design information. Hence, the need for automated design information extraction using advanced text mining techniques is increasing. However, most existing text mining approaches have problems mining design information in depth, which results in low efficiency when applying the discovered information to improve the design project. With the aim of extracting process-oriented design information from design documents in depth, this paper proposes a layered text mining approach that produces a hierarchical process model capturing the process behavior at different levels of detail. Our approach consists of several interrelated algorithms, namely, a content-based document clustering algorithm, a hybrid named entity recognition (NER) algorithm and a frequency-based entity relationship detection method, which have been integrated into a system architecture for extracting design information from coarse-grained views to fine-grained specifications. To evaluate the performance of the proposed algorithms, experiments were conducted on an email archive collected from a real-life design project. The results showed an increase in detection accuracy for process-oriented information.
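The frequency-based entity relationship detection step can be sketched as document-level co-occurrence counting. The threshold rule and the input format (a list of per-document entity lists, e.g. one list per email) are assumptions; the paper's method operates inside a larger layered pipeline.

```python
from collections import Counter
from itertools import combinations

def detect_relationships(docs_entities, min_count=2):
    """Propose an (unordered) entity pair as related if the two named
    entities co-occur in at least `min_count` documents."""
    counts = Counter()
    for ents in docs_entities:
        for pair in combinations(sorted(set(ents)), 2):
            counts[pair] += 1
    return {pair for pair, c in counts.items() if c >= min_count}
```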

Commentary by Dr. Valentin Fuster

36th Computers and Information in Engineering Conference: SEIKM: Knowledge Capture, Reuse, and Management

2016;():V01BT02A025. doi:10.1115/DETC2016-59177.

Now that products of all kinds are increasingly connected to the Internet, it is expected to become easier to collect data on how they are actually used during the middle-of-life stage of their lifecycles. At the same time, a growing number of data analytics technologies offer opportunities to transform this data into actionable knowledge. Over the years, such knowledge extracted from usage data has already become a reliable input for managing maintenance and related services, but other uses, such as feedback to design — where product data management systems have started to offer support for data collection practices — and providing advice to end users, are now also being considered. Most data from sensors and other product-embedded information devices are collected in batches and analyzed retrospectively. In order for companies to further benefit from data collection in terms of efficacy and acceptance in society, two key challenges are (i) finding ways to effectively use data analytics techniques, which currently do not seem to be used to their full potential, and (ii) finding a good trade-off between respecting privacy and producing useful knowledge.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A026. doi:10.1115/DETC2016-59226.

This paper proposes a Hierarchical Bayesian Network (HBN) approach to estimate the uncertainty in performance prediction of manufacturing processes by aggregating the uncertainty arising from multiple models at multiple levels. A HBN is an extension of a Bayesian network (BN) for modeling hierarchical or multi-level systems where each node may represent a lower-level BN. The BNs at different levels can be constructed either using physics-based models or available data or by a hybrid approach through a combination of physics-based models and data. An improved BN learning algorithm is presented where the topology is learnt using an existing algorithm but different parametric and non-parametric models are fit to represent the conditional probabilities. Data for model calibration may be available at multiple levels such as at the unit process level or line level or sometimes at the factory level. Using all the data for calibration can be computationally expensive; therefore, a multi-level segmented approach for model calibration is developed. The injection molding process is used to demonstrate the proposed methodologies for uncertainty prediction in its energy consumption.
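The multi-level aggregation idea can be illustrated with a simple Monte Carlo sketch. This is not the authors' HBN: the distributions, parameter values and two-level structure below are invented placeholders standing in for calibrated lower-level BNs whose outputs feed a higher-level node.

```python
# Propagate uncertainty through a two-level hierarchy by sampling:
# unit-process energy is drawn from (hypothetical) calibrated
# distributions; line-level energy aggregates the unit-level samples.
import random

random.seed(0)

def sample_unit_energy():
    # Lower-level model: melt + injection energy, each uncertain (kWh).
    melt = random.gauss(5.0, 0.5)
    inject = random.gauss(2.0, 0.2)
    return melt + inject

def sample_line_energy(n_units=10):
    # Higher-level node aggregates the unit-level samples.
    return sum(sample_unit_energy() for _ in range(n_units))

samples = [sample_line_energy() for _ in range(2000)]
mean = sum(samples) / len(samples)
spread = max(samples) - min(samples)   # crude uncertainty indicator
```

In the paper's approach the conditional probabilities at each node would instead come from fitted parametric or non-parametric models, and calibration data could enter at any level of the hierarchy.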

Commentary by Dr. Valentin Fuster
2016;():V01BT02A027. doi:10.1115/DETC2016-59405.

Capturing and representing knowledge for decision support in engineering design is highly valuable, and ontology is a promising knowledge modeling scheme in the engineering domain. In this paper, an ontology is proposed for capturing, representing and documenting the knowledge related to hierarchical decisions in the design of complex engineered systems. The ontology is developed based on the coupled Decision Support Problem (DSP) construct, taking into consideration the requirements for a computational model that represents a decision hierarchy. Key to the ontology are two classes: Process, which represents the basic hierarchy building blocks in which the DSPs are embedded, and Interface, which represents the DSP information flows that link different Processes into a hierarchy. The efficacy of the ontology is demonstrated using a portal frame design example.

Topics: Design , Ontologies
Commentary by Dr. Valentin Fuster
2016;():V01BT02A028. doi:10.1115/DETC2016-59594.

Research shows that failures in the standardization process often result from communication and organizational issues between those involved in the committee and the user community. This is mainly caused by two issues: first, a lack of integration of available standards development tools with communication and social interfaces; and second, the difficulties inherent in organizing and collating information in a semantically meaningful manner. To this effect, the authors present a Visual Ontological Language for Technical Standards (VOLTS). VOLTS is a prototype environment that seeks to address the latter problem introduced above. In VOLTS, standards developers visually create standards within a network of information. VOLTS builds upon a tool developed by the National Institute of Standards and Technology (NIST) called the NIST Ontological Visualization Interface for Standards (NOVIS), which presented a novel method for visualizing the content and connections of standards but lacked the ability to let users alter that information. VOLTS focuses on providing users with a process that allows for verification and validation at all stages of development. To that effect, VOLTS incorporates research done by NIST on building a Framework for Analysis, Comparison, and Test of Standards (FACTS). The examples presented herein use the openly available standards World Wide Web Consortium’s (W3C) Web Ontology Language (OWL) 2 and the Data Mining Group’s (DMG) Predictive Model Markup Language (PMML) to demonstrate the VOLTS process and methodology. Future work discussed will seek to address the former problem introduced above.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A029. doi:10.1115/DETC2016-59915.

In this paper we demonstrate and compare two complementary approaches to the automatic generation of production rules from a set of given graphs representing sample designs. The first approach generates a complete rule set from scratch by means of frequent subgraph discovery, whereas the second learns additional rules that fit an existing, yet incomplete, rule set using genetic programming. Both approaches have been developed and tested in the context of an application for automated conceptual engineering design, more specifically functional decomposition. They can be considered feasible, complementary approaches to the automatic inference of graph rewriting rules for conceptual design applications.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A030. doi:10.1115/DETC2016-59964.

The development of manufacturing technologies for new materials involves the generation of large and continually evolving volumes of data. The analysis, integration and management of these data, typically stored in multiple independently developed databases, creates significant challenges for practitioners. Strategies which allow open sharing of pre-competitive data pertaining to engineering design can play a powerful role in enabling innovation, but these strategies can work only if the data themselves can be presented in a way that is both consistent and understandable to both humans and computers.

We believe that ontology applied to engineering (OE) represents a viable strategy for the alignment, reconciliation and integration of diverse and disparate data. The scope of OE includes: consistent capture of knowledge pertaining to the types of entities involved; facilitation of cooperation among diverse groups of experts; effective and flexible ongoing curation and update of data; and collaborative design and knowledge reuse.

As a case study, we propose an ontology for the domain of composite materials, focusing in particular on the class of Functionally Graded Materials (FGM), with examples drawn from the field of biomedical applications. The goal of the ontology is to provide information about the components of such materials, the manufacturing processes involved in their creation, and their applications, ranging from additive manufacturing to restorative dentistry. The ontology is developed using Basic Formal Ontology (BFO) and parts of the Ontology for Biomedical Investigation (OBI) and follows the best practice principles for ontology development codified in the OBO (Open Biomedical Ontologies) Foundry.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A031. doi:10.1115/DETC2016-60159.

Distributed computer-integrated manufacturing is increasingly adopting cloud computing, software-as-a-service (SaaS) and multi-agent systems as steps toward a “design anywhere, build anywhere” strategy. In this scenario, foundation ontologies not only serve as a common message exchange structure among distributed agents but also provide reasoning services to extract implicit knowledge from explicit information already stored in the knowledge base. Foundation ontologies, comprising the most general concepts of a domain, provide a common semantic structure to the domain-level ontologies, which capture details of multi-disciplinary manufacturing knowledge. In this paper, a foundation ontology for manufacturing process planning is proposed and, as a case study, manufacturing process selection information for a sample prismatic design feature is modeled using it.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A032. doi:10.1115/DETC2016-60196.

Design for additive manufacturing (DFAM) gives designers new freedoms to create complex geometries and combine parts into one. However, it has its own limitations and, more importantly, requires a shift in thinking away from traditional design for subtractive manufacturing. There is a lack of formal and structured guidelines, especially for novice designers. To formalize knowledge of DFAM, we have developed an ontology using formal OWL/RDF representations in the Protégé tool. The description logic formalism facilitates expressing domain knowledge as well as capturing information from benchmark studies. This is demonstrated in a case study with three design features: revolute joint, thread assembly (screw connection), and slider-crank. How multiple instances (build events) are stored and retrieved in the knowledge base is discussed in light of the modeling requirements for the DFAM knowledge base: knowledge capture and reuse, support for a tutoring system, and integration into CAD tools. A set of competency questions is described to evaluate knowledge retrieval. Examples are given with SPARQL queries. Knowledge documentation is the main objective of the current ontology. However, description logic creates multiple opportunities for future work, including representing and reasoning about DFAM rules in a structured modular hierarchy, discovering new rules with induction, and recognizing patterns with classification, e.g., what leads to “successful” vs. “unsuccessful” fabrications.
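The flavor of such a competency question can be shown with a tiny in-memory triple store. This is purely illustrative: the entity names and predicates below are invented, and a real DFAM knowledge base would be queried with SPARQL against the OWL/RDF ontology rather than with Python pattern matching.

```python
# A handful of (subject, predicate, object) triples standing in for
# build-event instances stored in the DFAM knowledge base.
triples = {
    ("build_042", "hasFeature", "revolute_joint"),
    ("build_042", "hasOutcome", "successful"),
    ("build_043", "hasFeature", "thread_assembly"),
    ("build_043", "hasOutcome", "unsuccessful"),
}

def query(pattern):
    """Match an (s, p, o) pattern; None acts as a variable."""
    return [t for t in triples
            if all(q is None or q == v for q, v in zip(pattern, t))]

# Competency question: which build events had a successful outcome?
successful = [s for s, _, _ in query((None, "hasOutcome", "successful"))]
```

The equivalent SPARQL would bind a variable to the subject of every `hasOutcome "successful"` triple; the pattern-with-variables structure is the same.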

Commentary by Dr. Valentin Fuster
2016;():V01BT02A033. doi:10.1115/DETC2016-60392.

Software tools, knowledge of materials and processes, and data provide three pillars on which Additive Manufacturing (AM) lifecycles and value chains can be supported. These pillars leverage efforts dedicated to the development of AM databases, high-fidelity models, and design and planning support tools. However, as of today, it remains a challenge to integrate distributed AM data and heterogeneous predictive models in software tools to drive a more collaborative AM development environment. In this paper, we describe the development of an analytical framework for integrated and collaborative AM development. Information correlating material, product design, process planning and manufacturing operations are captured and managed in the analytical framework. A layered structure is adopted to support the composability of data, models and knowledge bases. The key technologies to enable composability are discussed along with a suite of tools that assist designers in the management of data, models and knowledge components. A proof-of-concept case study demonstrates the potential of the AM analytical framework.

Commentary by Dr. Valentin Fuster

36th Computers and Information in Engineering Conference: Systems Engineering Information Knowledge Management (SEIKM General)

2016;():V01BT02A034. doi:10.1115/DETC2016-59469.

Design optimization of a complicated system requires modeling multidisciplinary engineering analyses to verify its physical behavior, together with sophisticated formalization of the optimization problem. Although CAE systems and optimization software packages can support the calculation and visualization of results, it is important to build appropriate models and formalizations considering critical phenomena, modeling conditions, the level of detail and so on. These processes remain within the realm of engineers’ tacit knowledge. This paper proposes a new management method that helps a designer clarify the empirical knowledge of modeling and formulation in the reflective design process of a multidisciplinary system, and flexibly operate on it during the design process. A knowledge representation format based on the IDEF0 function modeling method is developed in order to represent and operate on the whole/part relationships and the causal relationships among design parameters, and the contents of the optimization formulation. The ability of the proposed method is verified with a design example of a desiccant air-conditioning system.

Topics: Design , Modeling
Commentary by Dr. Valentin Fuster
2016;():V01BT02A035. doi:10.1115/DETC2016-59721.

Smart manufacturing combines advanced manufacturing capabilities and digital technologies throughout the product lifecycle. These technologies can provide decision-making support to manufacturers through improved monitoring, analysis, modeling, and simulation that generate more and better intelligence about manufacturing systems. However, challenges and barriers have impeded the adoption of smart manufacturing technologies. To begin to address this need, this paper defines requirements for data-driven decision making in manufacturing based on a generalized description of decision making. Using these requirements, we then focus on identifying key barriers that prevent the development and use of data-driven decision making in industry as well as examples of technologies and standards that have the potential to overcome these barriers. The goal of this research is to promote a common understanding among the manufacturing community that can enable standardization efforts and innovation needed to continue adoption and use of smart manufacturing technologies.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A036. doi:10.1115/DETC2016-59941.

This paper presents a graph grammar based automated tool that can generate bond graphs of various systems for dynamic analysis. A generic graph grammar based representation scheme has been developed for different system components and bond graph elements. Using that representation, grammar rules have been developed that enable interpreting a given system and generating a bond graph through an algorithmic search process. The paper also demonstrates the utility of the proposed tool in classrooms to add value to bond graph based system dynamics education. The underlying technique, various examples and the benefits of this automated tool are highlighted.
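At its simplest, the rule idea maps each recognized component type to a bond graph element. The sketch below is a deliberately flattened toy (a dictionary lookup rather than true graph rewriting, and an invented component list), meant only to show the component-to-element correspondence the grammar rules encode.

```python
# Grammar-style rules: component type -> standard bond graph element.
RULES = {
    "resistor": "R",   # dissipative element
    "spring":   "C",   # capacitive (compliance) element
    "mass":     "I",   # inertial element
    "source":   "Se",  # effort source
}

def to_bond_graph(components):
    """Apply the rules to each recognized component type in order."""
    return [RULES[c] for c in components]

# Invented mass-spring-damper system driven by a force source.
elements = to_bond_graph(["source", "mass", "spring", "resistor"])
```

The actual tool additionally rewrites the connection topology (junctions and bonds), which is where the algorithmic search process comes in.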

Commentary by Dr. Valentin Fuster
2016;():V01BT02A037. doi:10.1115/DETC2016-59988.

Search engine efficiency is an essential prerequisite for a satisfactory on-line purchasing experience. Despite the powerful tools available today, search engines are limited to a semantic elaboration of keywords and do not allow users to find product categories outside their own knowledge sphere. In this context, an effective search engine requires tools able to understand what the user is looking for and to suggest the products that best satisfy their needs, regardless of the user’s background.

To this aim, this paper proposes an innovative smart search strategy based on artificial intelligence technologies. In order to highlight the system’s potential, the smart object (SO) market is considered as a case study. The SO market has grown so quickly that it disorients the average user: it offers a wide variety of apparently similar products that are in fact characterized by different features the average user fails to perceive.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A038. doi:10.1115/DETC2016-60422.

A fundamental challenge for engineers applying Model-Based Systems Engineering (MBSE) is to analyze complex design problems effectively and to represent a complex system with complete models. Aiming to provide an efficient design-problem analysis process and a general modeling framework, this paper proposes an Axiomatic Design Theory (ADT) based systems engineering approach, mapping ADT’s design domains, design axioms and “Z” mapping process to a complex system’s design processes, modeling and control processes, and information loops, respectively. An MBSE modeling framework comprising a Requirement Model, a Functional Architecture and a Physical Implementation Architecture is then constructed. The presented method and modeling framework guide system engineers in capturing design problems effectively. A case study demonstrates how the approach is applied, and the paper closes with conclusions and future work aimed at making the MBSE modeling framework function in a consistent way.

Commentary by Dr. Valentin Fuster

36th Computers and Information in Engineering Conference: VES: Interaction and Interfaces

2016;():V01BT02A039. doi:10.1115/DETC2016-59130.

Brain signal and eye tracking technology have been intensively applied in cognitive science to study reading, listening and learning processes. Though promising results have been obtained in laboratory experiments, there are no smart reading aids capable of estimating difficulty during normal reading. This paper presents a new concept that aims to tackle this challenge. Based on a literature study and an experiment, we have identified several indicators for characterizing word processing difficulty by interpreting electroencephalography (EEG) and electrooculography (EOG) signals. We have defined a computational model based on fuzzy set theory, which estimates the probability of word processing and comprehension difficulty during normal reading. The paper also presents the concept and a functional prototype of a smart reading aid, which is used to demonstrate the feasibility of our solution. The results of our research prove that it is possible to implement a smart reading aid capable of detecting reading difficulty in real time. We show that the most reliable indicators are related to eye movement (i.e. fixation and regression), while brain signals are less dependable sources for indicating word processing difficulty during continuous reading.
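A fuzzy difficulty estimate of this kind can be sketched with simple membership functions. The thresholds, indicator choices and max-combination below are invented placeholders, not the calibrated model from the study, which also weighs EEG/EOG-derived indicators.

```python
# Map eye-movement indicators to a [0, 1] word-difficulty estimate
# using piecewise-linear fuzzy membership functions.
def ramp(x, lo, hi):
    """Membership: 0 below lo, 1 above hi, linear in between."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def difficulty(fixation_ms, regressions):
    mu_fix = ramp(fixation_ms, 200, 600)   # long fixations suggest difficulty
    mu_reg = ramp(regressions, 0, 3)       # re-reading suggests difficulty
    return max(mu_fix, mu_reg)             # fuzzy OR via the max operator

easy = difficulty(180, 0)   # short fixation, no regressions
hard = difficulty(550, 2)   # long fixation plus re-reading
```

In a deployed reading aid, the output would be smoothed over consecutive words before triggering any assistance.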

Commentary by Dr. Valentin Fuster
2016;():V01BT02A040. doi:10.1115/DETC2016-59505.

A product is chosen by users not only for the features it offers but also for the perceived experience of use. This statement, widely recognized in the literature, highlights that the key issues for the success of interactive products are the practical, experiential, affective, meaningful and valuable aspects of interaction.

In recent years, gesture-based interfaces have been introduced to make the experience of interaction more emotional, intuitive and natural. For this reason, the design and development of products integrating gesture-based interfaces represents a challenging issue.

In this context, a User-Centered Design (UCD) method to implement novel interaction paradigms in traditional consumer products is proposed. Its application in a real case study addresses the development and prototyping of a system exploiting gesture-based interaction to train aspiring orchestra conductors. A young musician who wants to play a guitar usually has no great problem finding one, but a musician who wants to learn the ins and outs of conducting must instead find an electronic device that can be operated with the hand movements conductors use. The developed system provides three main functionalities: tempo control, velocity control and instrument activation. It represents both a use case to validate the proposed UCD method and an innovative solution for training aspiring conductors.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A041. doi:10.1115/DETC2016-59886.

In the design and engineering context, the use of tools, simulations and multi-realities is already an intrinsic part of design activities, methods and processes. To support participatory design during the ideation phase in a co-creative context, participative tools are needed. User-centered and co-creative design could benefit product creation and innovation processes through data collection (incl. product characteristics and user requirements) from individual data-mining activities.

The traditional approach for customer requirements prioritization is pair-wise comparison. It is used both in the QFD method and in the Pugh matrix method. In practice, this means that a user compares two product characteristics at a time and decides which one of the two is more important or if they are equally important. Determining a suitable user interface for the comparison has proven to be the most demanding phase in the implementation of this method.

This paper presents alternative ways to implement a customer property tool and discusses experiences with some of its implementations. In the first version, the interface is based on the use of numbers, whereas the last version is more visual, interactive and game-like. The feasibility of the tool was studied in user tests carried out in Finland and in the Netherlands.
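The pairwise-comparison mechanics described above can be sketched as a simple tally. The characteristics and votes below are invented for illustration; a Pugh- or QFD-style implementation would normalize the scores into weights rather than just ranking.

```python
# Turn a user's pairwise preferences into a priority ranking:
# each comparison awards 1 point to the winner, or 0.5 to each on a tie.
from itertools import combinations

characteristics = ["battery life", "weight", "price"]

def prioritize(preferences):
    """preferences: frozenset pair -> preferred item, or None for a tie."""
    score = {c: 0.0 for c in characteristics}
    for pair in combinations(characteristics, 2):
        winner = preferences[frozenset(pair)]
        if winner is None:
            for c in pair:
                score[c] += 0.5
        else:
            score[winner] += 1.0
    return sorted(characteristics, key=score.get, reverse=True)

prefs = {
    frozenset({"battery life", "weight"}): "battery life",
    frozenset({"battery life", "price"}): "battery life",
    frozenset({"weight", "price"}): None,  # equally important
}
ranking = prioritize(prefs)
```

Whatever the interface (numeric entry or the game-like visual version), the user's answers ultimately populate a preference structure of this shape.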

Commentary by Dr. Valentin Fuster
2016;():V01BT02A042. doi:10.1115/DETC2016-59956.

Hybrid Design Tool Environments (HDTE) allow designers and engineers to use real tangible tools and physical objects and/or artifacts to make and create real-time virtual representations and presentations on-the-fly. Manipulations of the real tangible objects (e.g., real wire mesh, clay, sketches, etc.) are translated into 2-D and/or 3-D digital CAD software and/or virtual instances. The HDTE is equipped with a Loosely Fitted Design Synthesizer (NXt-LFDS) to support this multi-user interaction and design processing. The current study explores, for the first time, the feasibility of using a NXt-LFDS in a networked immersive multi-participant social virtual reality environment (VRE). Using Oculus Rift goggles and PC computers at each location linked via Skype, team members physically located in several countries had the illusion of being co-located in a single virtual world, where they used rawshaping technologies (RST) to design a woman’s purse in 3-D virtual representations, with the possibility of printing the purse out on the spot (i.e. anywhere within the networked loop) with a 2-D or 3-D printer. Immersive, affordable Virtual Reality (VR) technology (and 3-D AM) is becoming commercially available and widely used by mainstream consumers, a major development that could transform the collaborative design process. The results of the current feasibility study suggest that product design may become considerably more individualized within collaborative multi-user settings and less inhibited in the coming ‘Diamond Age’ [1] of VR and collaborative networks, with profound implications for the design (e.g. fashion) and engineering industries. This paper presents the proposed system architecture, a collaborative use-case scenario, and preliminary results on interaction, coordination, cooperation, and communication with immersive VR.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A043. doi:10.1115/DETC2016-60063.

To cooperate effectively with a human, advanced manufacturing machines are expected to execute industrial tasks according to human natural language (NL) instructions. However, NL instructions are often not explicit enough to be understood and not complete enough to be executed, leading to incorrect executions or even execution failure. To address these problems and improve execution performance, we developed a Natural-Language-Instructed Task Execution (NL-Exe) method. In NL-Exe, semantic analysis is adopted to extract task-related knowledge, on the basis of which human NL instructions are accurately understood. In addition, logic modeling is conducted to find missing execution-related specifications, with which incomplete human instructions are repaired. By orally instructing a humanoid robot Baxter to perform the industrial tasks “drill a hole” and “clean a spot”, we show that NL-Exe enables an advanced manufacturing machine to accurately understand human instructions, improving its performance in industrial task execution.

Commentary by Dr. Valentin Fuster

36th Computers and Information in Engineering Conference: VES: General Technology for Augmented, Virtual and Mixed Reality

2016;():V01BT02A044. doi:10.1115/DETC2016-59286.

This paper reports the development of a novel haptic 3D computer-based hip replacement simulator. A haptic device provides a kinesthetic interface in a virtual environment for conducting hip surgery. Predictive software enables modeling the risk of hip dislocation, which was missing from previous simulators. The developed neural network autonomously matches compatible implant components from a library of industry standard part codes and sizes. The parameter-driven simulator enables patient-specific modeling of the femur and acetabulum. Combining haptic feedback with 3D graphics, the simulator enables training and assessment of orthopedic surgeons. The simulator includes haptic feedback for the orthopedic tools, including reamers, saws, hip stems and acetabular cup implants. The hip replacement simulator allows surgeons to practice placing the stem and cup, providing a haptic sense of touch to replicate the in-vivo procedure. The novel capability to assess risk of dislocation could reduce post-operative dislocation. Enhancing the skill and accuracy of trainee hip surgeons can reduce the number of revision surgeries required, extend the life of artificial hip implants and improve patient safety, reducing costs for the health service.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A045. doi:10.1115/DETC2016-59997.

Large scale scene generation is a computationally intensive operation, and added complexities arise when dynamic content generation is required. We propose a system capable of generating virtual content from non-expert input. The proposed system uses a 3-dimensional variational autoencoder to interactively generate new virtual objects by interpolating between extant objects in a learned low-dimensional space, as well as by randomly sampling in that space. We present an interface that allows a user to intuitively explore the latent manifold, taking advantage of the network’s ability to perform algebra in the latent space to help infer context and generalize to previously unseen inputs.
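The latent-space operations named above (interpolation, random sampling from the prior, and latent algebra) reduce to simple vector arithmetic once a model is trained. The sketch below uses plain Python lists and invented two-dimensional codes in place of the trained 3-D variational autoencoder's encoder/decoder.

```python
# Core latent-space manipulations behind the interactive generator.
import random

def interpolate(z_a, z_b, t):
    """Linear interpolation between two latent codes (0 <= t <= 1)."""
    return [(1 - t) * a + t * b for a, b in zip(z_a, z_b)]

def sample_prior(dim, rng=random.Random(0)):
    """Draw a new latent code from the unit Gaussian prior."""
    return [rng.gauss(0.0, 1.0) for _ in range(dim)]

def latent_algebra(z_a, z_b, z_c):
    """Element-wise a - b + c, the classic latent 'analogy' operation."""
    return [a - b + c for a, b, c in zip(z_a, z_b, z_c)]

# Invented codes standing in for encoded objects.
z_chair, z_leg, z_wheel = [1.0, 2.0], [0.5, 0.0], [0.0, 1.0]
z_new = latent_algebra(z_chair, z_leg, z_wheel)
mid = interpolate(z_chair, z_wheel, 0.5)
z_random = sample_prior(2)
```

In the real system each resulting code would be passed through the VAE decoder to produce a new 3-D virtual object.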

Commentary by Dr. Valentin Fuster
2016;():V01BT02A046. doi:10.1115/DETC2016-60401.

This paper presents a toolbox for rigid object tracking with a focus on augmented reality applications. Augmented reality relies on tracking to superimpose virtual objects on physical objects. Object tracking is usually based on registration and pose estimation techniques. Many different approaches have already been introduced. Our research focuses on tracking for application areas such as assembly assistance and the most promising candidate is rigid object tracking based on point cloud registration. Our work advances the robustness of point cloud-based tracking as well as the performance. One product of our research is our tracking tool TrackingExpert, which integrates all our research outcomes into one versatile software package. This paper introduces TrackingExpert covering functional areas such as the registration, visualizations, and experiment support. We also highlight several aspects which facilitate data analysis and ease our research.

Commentary by Dr. Valentin Fuster

36th Computers and Information in Engineering Conference: VES: Game Ecosystems in Engineering

2016;():V01BT02A047. doi:10.1115/DETC2016-59069.

This paper describes the vision and development of a tangible user interface (TUI) that allows ‘glassblowing-like’ interaction (IA) with a computer. The premise is that human fidelity in exerting pressure and airflow (i.e. breathing, blowing) could stimulate intuition and creative processing, and afford unconventional human-computer interaction (UHCI). The ultimate goal is to find out how the potential of the human body can be used to design, develop and analyze new spatial interaction methods that surpass the performance or application possibilities of currently available techniques. Multi-modal interactions are essential to computational processing whereby the human and machine are interconnected and coupled to enhance skills (analogue and digital), support rich performance, facilitate learning and foster knowledge in design and engineering processing. This paper describes the key concept of the TUI, the graphical user interface (GUI) and the data visualizer system. We illustrate the concept with a prototype system, the Air-Flow-Interaction-Interface (AFIF), and with testing and experimentation to identify underlying research issues.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A048. doi:10.1115/DETC2016-59201.

Health games are increasingly seen as a means to address issues in therapy and rehabilitation. Yet, as a transformative technology, such games have rarely been explored or exploited to assist research into pathologies. Serious games for research (SGR) to uncover pathologies would allow clinicians to develop new differential diagnostics while providing a positive experience for the subject. This paper is not about game design; nevertheless it presents considerations that could be taken forward when developing health-based SGRs for pathomechanics, etiopathogenesis and biofeedback. This work relates to preliminary studies on balance challenges manifested in pathologies of the central nervous system. As technology advancements seek to augment human sensory contact between virtual and real worlds, this may affect how virtual environments are used and designed in the future. As a consequence, heightened sensory input (or the lack thereof) may result in falls, for example in users with vestibular disorders, because postural stability is a key aspect of motor ability that allows individuals to sustain and maintain the desired physical position of their body. Here, our investigation is specific to the functional correspondence of incidental properties of human body sway between healthy subjects and subjects with dyslexia. Our early results suggest that postural sway can be distinguished between healthy subjects and those with mild disorders.

Commentary by Dr. Valentin Fuster
2016;():V01BT02A049. doi:10.1115/DETC2016-59202.

Current tools used to carry out design for manufacture and assembly (DFMA) evaluations are time-consuming to use. This increases the cost of bringing a product to market by extending the design process. The steps that comprise the design, analysis and redesign process are typically carried out in series, over long periods of time, using different tools. Introducing concurrent engineering practices could significantly reduce the time taken to complete this process and improve workplace DFMA learning. An opportunity exists to create and test an integrated real-time DFMA tool using the UNITY game engine, which could potentially address these problems. If achieved, this approach could shorten the design process and enable a greater number of solutions to be considered, potentially leading to a more optimal design. A more optimal design could in turn yield major cost reductions in later stages of product development by reducing the work needed to plan and carry out manufacturing and by creating a product that is easier and less costly to maintain.

This paper reports on a pilot haptic sketch-based system and investigates its feasibility for conducting concurrent DFMA.

2016;():V01BT02A050. doi:10.1115/DETC2016-59826.

In the education of mechanical engineers, alternative learning methods such as serious games and simulations have been used in past decades to improve learning outcomes. However, as digital technologies advance, so too does the quality of commercial game-based learning. This raises expectations: while serious games are still considered an experimental pedagogic vehicle, students' expectations of the learning experience and of using serious games become heightened. This is a challenge for several educational games that, though fully able to progress a learning goal, are deemed detached due to dated user interfaces and an inability to host the latest ICTs. This makes them unappealing to students and can also affect their motivation. This paper reports on early efforts to analyze serious games from the perspective of learning and gaming mechanics and of the virtual environments and systems that can be made pervasive. The intention is to refurbish dated serious games that are highly relevant to educating mechanical engineers. The proposed concepts lie in the adoption of new pervasive technologies enabled by cyber-physical systems (CPS) and the Internet of Things (IoT) to modernize dated engineering serious games.

2016;():V01BT02A051. doi:10.1115/DETC2016-60061.

Research has highlighted the capacity of Digital Games (DG) to enhance skills and abilities through their persuasiveness and motivational appeal, which can support immersive, situated and user-centered experiences. DG development remains a challenge, both in terms of cost and of the diverse range of advanced, multi-disciplinary expertise required. Developing DGs for a domain as complex as Mechanical Engineering (ME), to better equip engineering students to practice at the intersection of complex systems, increases this challenge. An alternative that decreases costs is to capitalize on existing DGs. This paper analyzes opportunities for DG adaptation, enabling existing games to be reengineered to fit specific purposes and support knowledge transfer. The authors build upon current research and practice to construct an approach for adapting DG content. Two case studies are presented as a proof of concept to exemplify the different levels of the digital game reengineering process.


36th Computers and Information in Engineering Conference: Virtual Environments and Systems (VES General)

2016;():V01BT02A052. doi:10.1115/DETC2016-59756.

In this paper, we discuss methods to efficiently render stereoscopic scenes of large-scale point clouds on inexpensive VR systems. Terrestrial laser scanners have recently improved significantly, and they can easily capture tens of millions of points in a short time from large sites, such as engineering plants. If 3D stereoscopic scenes of large-scale point clouds could be easily rendered using inexpensive devices, they might be used in casual product development phases. However, it is difficult to render such a huge number of points on common PCs, because VR systems require high frame rates to avoid VR sickness. To solve this problem, we introduce an efficient culling method for large-scale point clouds. In our method, we project all points onto angle-space panoramic images whose axes are the azimuth and elevation angles of the head direction. We then eliminate occluded and redundant points according to the resolution of the device. Once visible points are selected, they can be rendered at high frame rates. Visible points are updated when the user stays at a certain position to observe target objects. Since points are processed in image space, preprocessing is very fast. In our experiments, our method rendered stereoscopic views of large-scale point clouds at high frame rates.
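The angle-space culling idea above can be sketched as follows: project every point into an azimuth/elevation grid centred on the eye and keep only the nearest point per cell, so occluded and angularly redundant points are dropped. This is a minimal sketch under assumed defaults (grid resolution, function and parameter names are illustrative, not taken from the paper).

```python
import math

def cull_points(points, eye, az_bins=2048, el_bins=1024):
    """Angle-space culling sketch: bin each point by azimuth and
    elevation as seen from the eye position and keep only the nearest
    point per cell, discarding occluded and redundant points."""
    nearest = {}  # (azimuth cell, elevation cell) -> (distance, point)
    for p in points:
        dx, dy, dz = p[0] - eye[0], p[1] - eye[1], p[2] - eye[2]
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        if dist == 0.0:
            continue  # point coincides with the eye; skip it
        az = math.atan2(dy, dx)        # azimuth in -pi .. pi
        el = math.asin(dz / dist)      # elevation in -pi/2 .. pi/2
        cell = (int((az + math.pi) / (2 * math.pi) * az_bins),
                int((el + math.pi / 2) / math.pi * el_bins))
        # Keep the closest point along each viewing direction.
        if cell not in nearest or dist < nearest[cell][0]:
            nearest[cell] = (dist, p)
    return [p for _, p in nearest.values()]
```

Because every point is handled independently in image space, the pass is a single linear scan, which matches the claim that preprocessing is fast.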

2016;():V01BT02A053. doi:10.1115/DETC2016-59762.

Immersive virtual reality systems have the potential to transform the manner in which designers create prototypes and collaborate in teams. Using technologies such as the Oculus Rift or the HTC Vive, a designer can attain a sense of “presence” and “immersion” typically not experienced with traditional CAD-based platforms. However, one of the fundamental challenges of creating a high-quality immersive virtual reality experience is creating the immersive virtual reality environment itself. Typically, designers spend considerable time manually designing virtual models that replicate physical, real-world artifacts. While standard 3D models can be imported into these immersive virtual reality environments, such models are typically generic in nature and do not represent the designer’s intent. To mitigate these challenges, the authors propose the real-time translation of physical objects into an immersive virtual reality environment using readily available RGB-D sensing systems and standard network connections. The emergence of commercial, off-the-shelf RGB-D sensing systems such as the Microsoft Kinect has enabled the rapid 3D reconstruction of physical environments. The authors present a methodology that employs 3D mesh reconstruction algorithms and real-time rendering techniques to capture physical objects in the real world and represent their 3D reconstructions in an immersive virtual reality environment with which the user can then interact. A case study involving a commodity RGB-D sensor and multiple computers connected through standard TCP internet connections demonstrates the viability of the proposed methodology.
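The first stage of any RGB-D reconstruction pipeline like the one described is back-projecting depth pixels into camera-space 3D points via the pinhole model. As a minimal sketch (the function name and intrinsic parameters fx, fy, cx, cy are illustrative assumptions, not values from the paper):

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image into camera-space 3D points using the
    pinhole camera model:
        X = (u - cx) * z / fx,  Y = (v - cy) * z / fy,  Z = z.

    depth: 2D list of depth values in metres (0 means no measurement).
    fx, fy: focal lengths in pixels; cx, cy: principal point in pixels.
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z > 0.0:  # skip pixels with no depth measurement
                points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points
```

The resulting point set would then feed a meshing step (e.g. surface reconstruction) before being streamed over TCP to the rendering machines.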

Topics: Virtual reality
2016;():V01BT02A054. doi:10.1115/DETC2016-59840.

Observations of design review sessions conducted by means of Virtual Reality (VR), together with participant interviews, have shown that improvements can be made by providing selected user functions to participants who would otherwise just be observers. These functions were identified from a survey of VR users of varying experience in design reviews. The basic idea is that participants are given access to special review data via an app on a smart device and are thus able to control their own view of the product under review directly. For instance, participants can download the structure tree, including the components of the product under review, onto their devices, navigate the structure tree, and retrieve component parameters independently of the other participants.

Participants can also configure the representation of the VR model displayed to all participants via their smart device (highlighting, visibility, colour, ...) and set up the view. This makes communication easier, since otherwise the respective settings would have to be administered by a VR operator. Such user interaction functions may improve the control of design reviews by changing the participants’ role from passive to active.

A special requirement is the spatial selection of model components in the stereoscopic VR view. To meet this demand, a Leap Motion controller transmits position data from the user’s hand to the pointer of the VR system via the Virtual Reality Peripheral Network (VRPN) protocol.

Topics: Design
2016;():V01BT02A055. doi:10.1115/DETC2016-59872.

In this paper, a virtual acoustic model for simulating the noise of passing vehicles is described. The underlying motivation for the acoustic model is the increasing demand for realistic methods of simulating traffic and acoustic disturbance. For this purpose, simulation tools are desired that allow different kinds of traffic scenarios to be simulated within a Virtual Reality (VR) environment, including a plausible and realistic presentation of the acoustic situation. The virtual vehicle model introduced in this paper is composed of several individual sound sources, e.g. for the tires, the engine and the exhaust system. Each sound source has its own directional characteristics, which are represented by sets of digital filters per source. For this purpose, measurement series with a car on a dynamometer were performed.

Furthermore, in order to account for environmental conditions, a method for describing sound reflections at simple objects (e.g. plane walls) is explained in this paper. The implemented concept makes it possible to create an individual mirror source for every initial sound source and every object that can cause reflections. Since the concept is intended for interactive applications, the described method works in real time.
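The mirror-source construction for a plane wall follows the standard image-source method: the mirror source is the original source reflected across the wall plane. A minimal sketch, assuming the wall is given by a point on it and its normal (names and signature are illustrative, not from the paper):

```python
def mirror_source(source, wall_point, wall_normal):
    """Image-source sketch for a first-order reflection at a plane wall.

    Reflect the source position s across the plane through wall_point p
    with normal n:  s' = s - 2 * ((s - p) . n) * n,  with n a unit vector.
    """
    # Normalise the wall normal so the reflection formula holds.
    nx, ny, nz = wall_normal
    norm = (nx * nx + ny * ny + nz * nz) ** 0.5
    nx, ny, nz = nx / norm, ny / norm, nz / norm
    # Signed distance of the source from the wall plane.
    d = ((source[0] - wall_point[0]) * nx +
         (source[1] - wall_point[1]) * ny +
         (source[2] - wall_point[2]) * nz)
    return (source[0] - 2 * d * nx,
            source[1] - 2 * d * ny,
            source[2] - 2 * d * nz)
```

Because each mirror source is a closed-form reflection, updating all mirror sources per frame is cheap, which is consistent with the real-time requirement stated above.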

In order to evaluate the described virtual acoustic model, a test with volunteers was performed. These examinations are also described and discussed in this paper.

2016;():V01BT02A056. doi:10.1115/DETC2016-60205.

In a mixed human-robot team, adaptive automation methods can be applied based on the mental and cognitive states of human operators. Such adaptive behaviors can be designed to mitigate human errors and consequently improve task performance. However, real-time estimation of human internal states and of their effects on task performance remains challenging and has been the focus of much research in recent years. Several studies have shown that physiological feedback can be used to assess human states in multi-tasking environments. In this paper, we present the early development of an experimental setup to investigate human physiological data during interaction with a small group of robotic agents. Participants complete a simulated tele-exploration task while their brain activity and eye movements are recorded throughout the experiment. Statistical analyses are applied to the quantitative metrics to investigate the main effects and correlations between task performance and physiological features.

Topics: Robots , Design , Teams