In this case, the autoencoder network is used to perform dimensionality reduction, in combination with the parameters of a generative system (woolly threads), in order to visualize the high-dimensional parameter space of this sort of system.
Self-organizing maps (SOMs), a type of unsupervised artificial neural network, are used here to reduce the high dimensionality of the data while still retaining its non-linear high-dimensional associations. Because SOMs create associations between inputs, the map can offer an overview of possibilities within the given parameter space, without the designer having to manually tweak the system’s parameters.
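To make the idea concrete, here is a minimal self-contained sketch of SOM training in NumPy (not the exact implementation used in this project): each map node holds a weight vector, and for every training sample the best-matching node and its neighbors are pulled toward that sample, with the learning rate and neighborhood radius decaying over time. All function names and hyperparameters here are illustrative.

```python
import numpy as np

def train_som(data, grid_w=8, grid_h=8, iters=500, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organizing map on `data` (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    # One weight vector per map node, initialized randomly within the data range
    weights = rng.uniform(data.min(0), data.max(0),
                          size=(grid_h, grid_w, data.shape[1]))
    # Grid coordinates of every node, used by the neighborhood function
    gy, gx = np.mgrid[0:grid_h, 0:grid_w]
    for t in range(iters):
        # Decay learning rate and neighborhood radius over time
        lr = lr0 * np.exp(-t / iters)
        sigma = sigma0 * np.exp(-t / iters)
        x = data[rng.integers(len(data))]
        # Best-matching unit (BMU): node whose weights are closest to the sample
        d = np.linalg.norm(weights - x, axis=2)
        by, bx = np.unravel_index(d.argmin(), d.shape)
        # Gaussian neighborhood centered on the BMU
        nb = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
        # Pull every node's weights toward the sample, scaled by the neighborhood
        weights += lr * nb[..., None] * (x - weights)
    return weights

def winner(weights, x):
    """Grid position of the node closest to sample x."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(d.argmin(), d.shape)
```

After training, `winner` gives each design its location on the map; designs with similar parameter vectors end up at nearby grid positions, which is what produces the "overview" layout described above.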
The 7 selected models are used as inputs for the SOM. The resulting map interpolates between the inputs and applies a location rule, placing similar designs closer together on the map and dissimilar designs further apart. Each model is described by the following parameters:
– cutoff parameter
– stiffness parameter
– numToMove parameter
– number of control points parameter
– color parameter
– root location
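Before the designs go into the SOM, each model is encoded as a feature vector of its parameters. A sketch of that preprocessing step is below; the actual parameter values of the 7 woolly-threads models are not listed in the post, so the numbers here are made up, and multi-component parameters (color, root location) are reduced to single scalars purely for illustration. Min-max normalization is one common way to keep a wide-range parameter like numToMove from dominating the SOM's distance measure.

```python
import numpy as np

# Hypothetical parameter values for the 7 selected models (real values
# come from the generative system and are not given in the post).
# columns: cutoff, stiffness, numToMove, num control points, color, root location
params = np.array([
    [0.30, 0.80, 12.0,  8.0, 0.10, 0.25],
    [0.55, 0.40, 20.0,  6.0, 0.60, 0.75],
    [0.10, 0.95,  8.0, 10.0, 0.30, 0.50],
    [0.75, 0.20, 25.0,  5.0, 0.85, 0.10],
    [0.45, 0.60, 15.0,  7.0, 0.45, 0.90],
    [0.90, 0.10, 30.0,  4.0, 0.70, 0.40],
    [0.20, 0.70, 10.0,  9.0, 0.20, 0.60],
])

# Min-max normalize each column to [0, 1] so every parameter contributes
# on the same scale to the SOM's Euclidean distance.
lo, hi = params.min(axis=0), params.max(axis=0)
features = (params - lo) / (hi - lo)
```

The resulting `features` matrix (7 designs × 6 normalized parameters) is the kind of input a SOM would be trained on.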
As in the previous post, because the dataset was relatively small, I used CPU-only computing (2 × Intel Core i7-4930K @ 3.40 GHz) for the training phase. If you are dealing with a big dataset, training on a GPU may be a better way to go.