
Move Cython random number generation to C++ mechanism#1745

Open
Legend101Zz wants to merge 16 commits into brian-team:master from Legend101Zz:feat/rng-cython

Conversation

@Legend101Zz
Contributor

@Legend101Zz Legend101Zz commented Dec 17, 2025

Fixes #1664

Random Number Generation Update (Runtime Mode)

Random number generation in runtime mode now uses the same C++ algorithm as C++ standalone mode. As a result:

  • Simulations run with seed(X) will now produce identical results in both runtime and standalone modes.
  • Simulations using the same seed will produce different results compared to earlier Brian2 versions when run in runtime mode.
  • C++ standalone mode results remain unchanged.
  • The get_random_state() and set_random_state() methods now return simplified values, making them easier to use.
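To make the state save/restore contract concrete, here is a stand-in sketch using Python's stdlib `random` module (not Brian2's actual API — `getstate`/`setstate` play the role of `get_random_state()`/`set_random_state()`): capturing the state and restoring it later replays the exact same random stream.

```python
import random

rng = random.Random(42)

# Stand-in for get_random_state(): capture the full RNG state.
state = rng.getstate()
first = [rng.random() for _ in range(5)]

# Stand-in for set_random_state(state): restore and replay.
rng.setstate(state)
replay = [rng.random() for _ in range(5)]

assert first == replay  # restoring the state reproduces the same stream
```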

cc: @mstimberg

Member

@mstimberg mstimberg left a comment


Many thanks for this PR – I'm not 100% sure about the approach to setting (and storing/restoring) seeds, but let's discuss this during our meeting 😊

@Legend101Zz
Contributor Author

@mstimberg sorry for the long wait on this, finally got back to it and now tried the pre-compiled RNG approach to prevent each code object from having its own RNG, let's see if this works :)

@Legend101Zz
Contributor Author

@mstimberg finally all tests pass :)

@Legend101Zz Legend101Zz requested a review from mstimberg January 24, 2026 18:30
Member

@mstimberg mstimberg left a comment


I was in the process of updating the documentation when I realized something, and I have to think more about how to handle it. With this PR, Cython's random number generation is basically identical to C++ standalone mode, but there is one difference: using the seed function sets both numpy's seed and the Cython RNG seed, whereas standalone mode does not set numpy's seed. This means that calling, e.g., seed(1234) in Cython runtime mode is equivalent to the following in C++ standalone mode:

np.random.seed(1234)
seed(1234)

This is actually something that I've sometimes seen in code, but it is not a good thing, since it means that the two sources of randomness use identical random number streams. It's not something that you'd usually notice, but introducing these correlations isn't great. In an – admittedly contrived – example:

G = NeuronGroup(10, "v : 1")
np.random.seed(1234)
seed(1234)
G.v = np.random.rand(10)
G.run_regularly("v -= rand()")
run(defaultclock.dt)

Here, the initial values and the random values subtracted from v in the first time step will be identical, leading to the value of 0 for v in all neurons… Not sure how to best handle this, I will think about this some more (and/or maybe open a discussion about it on the discourse forum).
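The zeroing effect in this example doesn't need Brian2 to reproduce: two stdlib `random.Random` instances seeded with the same value can stand in for numpy's RNG and the Cython RNG (a sketch of the correlation, not Brian2 code) — identical seeds mean identical streams, so subtracting one stream from the other gives exactly zero everywhere.

```python
import random

# Two RNG instances with the same seed stand in for numpy's RNG and
# Brian2's Cython RNG in the example above.
init_rng = random.Random(1234)  # plays the role of np.random.seed(1234)
sim_rng = random.Random(1234)   # plays the role of Brian2's seed(1234)

# "Initial values" drawn from the first stream...
v = [init_rng.random() for _ in range(10)]
# ...and the values subtracted in the first time step, from the second.
subtracted = [sim_rng.random() for _ in range(10)]

# Identical seeds -> identical streams -> v becomes exactly 0 everywhere.
result = [a - b for a, b in zip(v, subtracted)]
assert all(x == 0.0 for x in result)
```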

@Legend101Zz
Contributor Author


@mstimberg thanks for sharing, this is really interesting. One thought is to add an explicit parameter, something like:

def seed(self, seed=None, include_numpy=False):

so users can opt into seeding NumPy as well (possibly with a warning).

The main concern I see is backward compatibility: existing code that relied on seed() implicitly seeding NumPy would suddenly start producing different results, and that change would happen silently.

That’s why I’m wondering if a staged (two-step) approach might work better.

Instead of seeding NumPy with the raw seed S, we could seed it with a deterministic function of S, say f(S).

This keeps both RNGs reproducible while ensuring they’re statistically independent.
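One possible shape for such an f(S) is a domain-separated hash (a sketch using the stdlib's `hashlib`; the name `derive_numpy_seed` and the tag string are made up for illustration — the snippet further down uses a simpler additive constant instead):

```python
import hashlib

def derive_numpy_seed(seed):
    """Derive a NumPy seed f(S) from the Brian2 seed S (hypothetical helper)."""
    # Domain-separated hash: deterministic, but the derived seed differs
    # from the raw value, so the two RNG streams are not identical.
    digest = hashlib.sha256(b"brian2-numpy-seed:" + str(seed).encode()).digest()
    # np.random.seed expects an integer in [0, 2**32 - 1].
    return int.from_bytes(digest[:4], "little")

# Same Brian2 seed always maps to the same derived NumPy seed...
assert derive_numpy_seed(1234) == derive_numpy_seed(1234)
# ...and the derived value stays in the range np.random.seed accepts.
assert 0 <= derive_numpy_seed(1234) < 2**32
```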

To avoid a silent breaking change, we could introduce this gradually with a deprecation-style parameter.

def seed(self, seed=None, include_numpy='deprecated'):
    import numpy as np

    from brian2.random.cythonrng import seed as rng_seed

    rng_seed(seed)

    if seed is not None:
        if include_numpy == 'deprecated':
            # STAGE 1: preserve the current behavior, but warn
            np.random.seed(seed)
            logger.warn(
                "Seeding NumPy with the same value as Brian2 is deprecated. "
                "In future versions, NumPy will use a derived, independent seed."
            )
        elif include_numpy is True:
            # STAGE 2 / OPT-IN: derived deterministic seed
            # (0x9E3779B9 is a well-known constant, the golden ratio × 2^32,
            # used in hash functions)
            numpy_seed = (seed + 0x9E3779B9) & 0xFFFFFFFF
            np.random.seed(numpy_seed)

This way:

• Existing users get the same behavior (with a warning)
• New behavior is opt-in initially
• Eventually the derived seed can become default

Also, if we adopt the derived-seed approach, we could update the standalone code generator to inject the same NumPy seeding logic into the generated Python script.
