6GCVAE

6GCVAE encodes IPv6 addresses as token sequences, passes them through gated convolutional encoder/decoder networks, and samples from the latent space to synthesise new addresses. A Kullback–Leibler regularisation term pulls the learned latent distribution toward the prior, yielding a smooth latent space in which nearby points decode to similar, plausible addresses.
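As a rough illustration of the token-sequence view, an IPv6 address can be expanded to its full 32-nibble form and each hex nibble mapped to an integer token id. This is a hypothetical sketch of one plausible tokenization, not the exact scheme used internally:

```python
import ipaddress

# Hypothetical sketch: tokenize an IPv6 address into 32 hex-nibble tokens
# (one token per half-byte of the 128-bit address). The vocabulary is the
# 16 hex digits, mapped to integer ids 0..15.

def tokenize(addr: str) -> list[int]:
    """Expand an IPv6 address to its canonical 32-nibble form and map
    each nibble to an integer token id."""
    exploded = ipaddress.IPv6Address(addr).exploded  # '2001:0db8:...:0001'
    nibbles = exploded.replace(":", "")              # 32 hex characters
    return [int(c, 16) for c in nibbles]

tokens = tokenize("2001:db8::1")
print(len(tokens))   # 32
print(tokens[:4])    # [2, 0, 0, 1]
```

A fixed-length sequence like this is convenient for convolutional encoders, since every address yields exactly 32 tokens regardless of how it is abbreviated in text.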

  • Reference: Placeholder citation (coming soon).

Train

rmap train --seeds seeds/hitlist.txt --output models/6gcvae.bin six-gcvae \
  --latent-dim 32 --hidden-dim 128 --epochs 100

Generate

rmap generate --model models/6gcvae.bin --count 100000 \
  --output 6gcvae.txt

Configuration

  • --seed <u64> – RNG seed for training and generation (default 42).
  • --latent-dim <usize> – dimensionality of the VAE latent space (default 32).
  • --hidden-dim <usize> – width of convolutional layers (default 128).
  • --epochs <usize> – training epochs before early stopping (default 100).
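The interaction between --seed and --latent-dim can be pictured with a small sketch: a seeded RNG draws latent vectors z of length latent-dim from a standard Gaussian, which the trained decoder would then map to token sequences. The function below is illustrative only (the decoder step is omitted), but it shows why a fixed seed makes generation reproducible:

```python
import random

# Hypothetical sketch: seeded sampling of latent vectors z ~ N(0, I).
# `seed` and `latent_dim` mirror the CLI flags; `count` is how many
# draws to take. The trained decoder (not shown) would turn each z
# into a 32-token address sequence.

def sample_latents(seed: int = 42, latent_dim: int = 32, count: int = 3):
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(latent_dim)]
            for _ in range(count)]

a = sample_latents(seed=42)
b = sample_latents(seed=42)
print(a == b)  # True: the same seed reproduces the same draws
```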

Model notes

  • The implementation currently caps the epoch count internally for demo builds, so the effective number of training epochs may be lower than the value passed via --epochs under the default configuration.
  • Serialized models store tokenizer metadata and network parameters so the generator can reproduce the same sampling behaviour offline.
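One way a model file could bundle tokenizer metadata with network parameters is a length-prefixed JSON header followed by raw weights. The layout and field names below are purely illustrative, not the actual rmap on-disk format:

```python
import json
import os
import struct
import tempfile

# Hypothetical sketch of a serialized-model layout: a little-endian
# length-prefixed JSON metadata block (tokenizer vocabulary, latent
# dimensionality), followed by the network parameters as float32 values.

def save_model(path, vocab, latent_dim, params):
    meta = json.dumps({"vocab": vocab, "latent_dim": latent_dim}).encode()
    with open(path, "wb") as f:
        f.write(struct.pack("<I", len(meta)))          # metadata length prefix
        f.write(meta)                                  # tokenizer metadata
        f.write(struct.pack(f"<{len(params)}f", *params))  # raw weights

def load_model(path):
    with open(path, "rb") as f:
        (n,) = struct.unpack("<I", f.read(4))
        meta = json.loads(f.read(n))
        raw = f.read()
        params = list(struct.unpack(f"<{len(raw) // 4}f", raw))
    return meta, params

# Round-trip check with toy values (exactly representable in float32).
path = os.path.join(tempfile.mkdtemp(), "toy.bin")
save_model(path, vocab=list("0123456789abcdef"), latent_dim=32,
           params=[0.5, -1.25, 2.0])
meta, params = load_model(path)
print(meta["latent_dim"])  # 32
```

Keeping the tokenizer metadata inside the model file is what lets the generator reproduce the same token-to-nibble mapping offline, without access to the original training data.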