new nn()
Main class for the creation of machine learning models. Acts as a thin client/proxy to the Web Worker where the actual models reside.
Methods
(async, static) createModel(options) → {Promise.<ModelProxy>}
Create a machine learning model for hydrological prediction and analysis. Supports five model architectures: Dense, LSTM, CNN, Transformer, and Segmentation. Models run in a Web Worker for non-blocking performance.
Parameters:
| Name | Type | Description |
|---|---|---|
| options | Object | Function options. params.type selects the architecture ('dense', 'lstm', 'cnn', 'transformer', or 'segmentation'); args holds the architecture-specific configuration shown in the examples below. |
Returns:
Model proxy object for training and prediction
- Type: Promise.<ModelProxy>
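Because the proxy relays every call to the Web Worker, all of its methods are asynchronous. The sketch below summarizes the proxy methods that appear in the examples which follow (train, predict, save); treat it as an assumed interface inferred from those examples, with inputs and targets standing in for real data arrays.
// Assumed ModelProxy usage, inferred from the examples below; inputs/targets are placeholder arrays
const proxy = await hydro.analyze.nn.createModel({
  params: { type: 'dense' },
  args: { inputShape: [2], units: [4, 1] }
});
await proxy.train({ params: { epochs: 5, batchSize: 8 }, data: [inputs, targets] }); // resolves when the worker finishes training
const estimates = await proxy.predict({ data: inputs }); // resolves with predictions computed in the worker
await proxy.save({ params: { name: 'example-model', location: 'indexeddb' } }); // persists architecture and weights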
Examples
// 1. Dense Neural Network - Simple regression/classification
const denseModel = await hydro.analyze.nn.createModel({
params: { type: 'dense' },
args: {
inputShape: [5], // 5 input features
units: [32, 16, 1], // 3 layers: 32→16→1 neurons
activation: 'relu', // Activation function
outputActivation: 'linear' // Linear output for regression
}
});
// Use case: Predict streamflow from rainfall, temperature, etc.
// 2. LSTM - Time series forecasting
const lstmModel = await hydro.analyze.nn.createModel({
params: { type: 'lstm' },
args: {
timeSteps: 30, // 30 days lookback
features: 3, // 3 variables (precip, temp, flow)
units: [64, 32], // 2 LSTM layers
outputUnits: 1, // Predict 1 day ahead
dropout: 0.2 // 20% dropout for regularization
}
});
// Use case: Forecast streamflow 1 day ahead using 30-day history
// 3. CNN - Spatial pattern recognition
const cnnModel = await hydro.analyze.nn.createModel({
params: { type: 'cnn' },
args: {
inputShape: [64, 64, 3], // 64x64 RGB image
filters: [32, 64, 128], // 3 conv layers with increasing filters
kernelSize: 3, // 3x3 convolution kernels
poolSize: 2, // 2x2 max pooling
denseUnits: [128, 10], // Dense layers after convolution
activation: 'relu'
}
});
// Use case: Classify land cover from satellite imagery
// 4. Transformer - Advanced time series with attention
const transformerModel = await hydro.analyze.nn.createModel({
params: { type: 'transformer' },
args: {
timeSteps: 60, // 60-step sequence
features: 5, // 5 input features
numHeads: 4, // 4 attention heads
dModel: 128, // Model dimension
numLayers: 2, // 2 transformer blocks
dff: 256, // Feedforward dimension
outputUnits: 1
}
});
// Use case: Complex multivariate forecasting with attention mechanisms
// 5. Segmentation - Image segmentation (U-Net architecture)
const segModel = await hydro.analyze.nn.createModel({
params: { type: 'segmentation' },
args: {
inputShape: [256, 256, 3], // Input image size
filters: [64, 128, 256], // Encoder filters
numClasses: 3, // 3 classes (water, land, vegetation)
kernelSize: 3,
poolSize: 2
}
});
// Use case: Segment water bodies from aerial imagery
// Complete workflow: Create → Train → Predict
const model = await hydro.analyze.nn.createModel({
params: { type: 'lstm' },
args: { timeSteps: 7, features: 2, units: [32], outputUnits: 1 }
});
// Train the model
await model.train({
params: { epochs: 50, batchSize: 32 },
data: [trainingInputs, trainingOutputs]
});
// Make predictions
const predictions = await model.predict({ data: testInputs });
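// Optionally persist the trained model before the session ends; a hedged sketch that
// mirrors the model.save call shown in the loadModel example below (the model name is arbitrary)
await model.save({ params: { name: 'streamflow-model', location: 'indexeddb' } });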
(async, static) loadModel(options) → {Promise.<ModelProxy>}
Load a previously saved machine learning model. Restores architecture and weights from storage. Supports loading from a URL or from IndexedDB.
Parameters:
| Name | Type | Description |
|---|---|---|
| options | Object | Loading options. params.url points to the saved model: an HTTP(S) URL to a model.json file or an indexeddb://model-name identifier, as shown in the examples below. |
Returns:
Loaded model proxy
- Type: Promise.<ModelProxy>
Examples
// Load from URL
const model = await hydro.analyze.nn.loadModel({
params: { url: 'https://example.com/models/streamflow/model.json' }
});
const predictions = await model.predict({ data: testData });
// Load from IndexedDB
const model = await hydro.analyze.nn.loadModel({
params: { url: 'indexeddb://my-trained-model' }
});
// Complete save/load cycle
// Training session:
const model1 = await hydro.analyze.nn.createModel({ /* config */ });
await model1.train({ data: [X, y], params: { epochs: 100 } });
await model1.save({ params: { name: 'flood-model', location: 'indexeddb' } });
// Later session:
const model2 = await hydro.analyze.nn.loadModel({
params: { url: 'indexeddb://flood-model' }
});
const results = await model2.predict({ data: newData });