Welcome Instructions Tutorial

Tutorial

Before following this tutorial, please have a look at the instructions for supported file types and formatting. This tutorial assumes that you are working in the root directory where you cloned the repository, i.e. the directory in which git created the AIMCEG folder.

Importing the event generator

from AIMCEG.ThreeParticlesEvent import EventGenerator

Building the generator

generator = EventGenerator()

This builds the generator that we have trained and provided with the toolkit. It was trained on Proton, Pi_plus, X events. You can now call the generate method to generate events.

generator.generate(100000)

Training the generator

Let’s start from scratch and build a generator to train on a different dataset.

generator = EventGenerator(filePath="eventData.csv", ignoreColumns=['M2pi', 'FIELD15'], computeParticle="pim", mass={"p":0.93827, "pip":0.1395, "pim": 0.1395})

In order to train your own generator, you need to provide the path to your data file (filePath), any columns to ignore (ignoreColumns), the particle to be computed by the lambda layer (computeParticle), and the particle masses (mass), as in the example above.

After you have successfully built the generator, it will print information about the data for you to validate. An example output is shown below:

Training on: 
 Particles:  ['p', 'pip']
Particles' features:  ['x', 'y', 'z']
Other features:  ['gamma']

You can access the data it parsed with

print(generator.data)
print(generator.trainingFeaturesList)

to validate.

Now, it’s time to start the training!

generator.train()

This will start the training with default parameters; you can override them by passing arguments to the train method.

Once training is finished, the models will be saved in the current directory (or the directory you specified) in two folders named generator and discriminator.
You can generate new events by calling the generate function as above. They will now follow the distribution of the data you trained the generator on.

(Beta) Uncertainty Quantification

You can quantify the uncertainties in the trained generator by calling the function below:

generator.quantifyUncertainty()

It will show uncertainty plots for the first three features in your trainingFeaturesList. This is a beta function and still under development.

Lambda Layer

The generator uses physics-informed machine learning at some level by utilizing a lambda layer. It learns to generate the features of all the particles but one; that particle's features are computed from the features of the other particles so that the law of conservation of momentum is not violated. Which particle the lambda layer computes can be set by passing the computeParticle argument while building the generator.
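As a rough illustration of this constraint (the numbers below are made up, and the real layer operates on tensors inside the model rather than plain arrays), momentum conservation fully determines the computed particle's three-momentum from the other particles and the initial state:

```python
import numpy as np

# Hypothetical momenta of the two generated particles, in GeV
p_proton = np.array([0.1, -0.2, 1.5])
p_piplus = np.array([-0.3, 0.1, 0.8])

# Assumed total initial-state momentum (e.g. a beam along z)
p_total = np.array([0.0, 0.0, 3.0])

# Momentum conservation fixes the third particle's momentum:
# it is the initial momentum minus the momenta already generated.
p_piminus = p_total - p_proton - p_piplus
print(p_piminus)  # [0.2 0.1 0.7]
```

This is the idea the lambda layer encodes: one particle is never generated freely but always reconstructed from the others.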

(Beta) Custom Lambda Layer

You may want to use your own version of the lambda layer to do some postprocessing on the generated features before they are fed to the discriminator during training. To implement it, create a generator class that inherits from our EventGenerator class and override the method named generatorLambdaLayer, as shown below:

import tensorflow as tf

class MyGenerator(EventGenerator):
    """Event generator with a custom lambda layer."""

    def __init__(self, filePath, ignoreColumns=[], computeParticle="", mass={}):
        # Static parameters for the lambda layer; must be set before
        # calling the super constructor, which passes them to the model.
        self.lambdaParams = [4, 5, 4, 6]
        # Optional learning rate scheduler
        self.lr_scheduler = tf.keras.optimizers.schedules.ExponentialDecay(initial_learning_rate=1e-4,
                                                                           decay_steps=10000,
                                                                           decay_rate=0.9)
        super().__init__(filePath, ignoreColumns=ignoreColumns, computeParticle=computeParticle, mass=mass)

    def generatorLambdaLayer(self, x, params):
        # <your lambda layer definition>
        ...

You can pass static elements (for instance, a mean and a standard deviation) to your lambda layer if needed by setting the self.lambdaParams variable as shown above. Make sure to initialize this variable before calling the constructor of the super class, because the constructor uses it when building the model. Inside the lambda layer definition, you can access it through the params argument, as shown above.
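The initialization order matters because in Python the base-class constructor runs inside super().__init__() and can read any attribute already set on self. A minimal, library-independent sketch (the class names here are hypothetical, not part of AIMCEG):

```python
class Base:
    def __init__(self):
        # The base constructor reads an attribute the subclass
        # must have assigned already, mirroring how EventGenerator
        # consumes lambdaParams while building the model.
        self.model_params = list(self.lambdaParams)

class Child(Base):
    def __init__(self):
        self.lambdaParams = [4, 5, 4, 6]  # set BEFORE calling super().__init__()
        super().__init__()

child = Child()
print(child.model_params)  # [4, 5, 4, 6]
```

Assigning self.lambdaParams after the super() call would raise an AttributeError, since the base constructor would look for it before it exists.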