In nonlinear lightwave systems, the events of interest, i.e., those that cause system errors, are often large deviations of the pulses. We present a method to determine such large deviations by formulating a constrained optimization problem, and we show that the resulting optimization problem can be solved efficiently by exploiting the mathematical structure of the governing equations.
These results then guide importance-sampled Monte Carlo simulations to determine the probabilities of such events. The method applies to a general class of intensity-based optical detectors and to arbitrarily shaped and multiple pulses.