pymepix.processing package

Subpackages

Submodules

pymepix.processing.acquisition module

pymepix.processing.baseacquisition module

Module deals with managing processing objects to form a data pipeline

class pymepix.processing.baseacquisition.AcquisitionPipeline(name, data_queue)[source]

Bases: object

Class that manages the various stages of an acquisition pipeline

T = ~T
addStage(stage_number, pipeline_klass, *args, num_processes=1, **kwargs)[source]

Adds a stage to the pipeline
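
A minimal sketch of assembling and running a pipeline by hand; the stage numbers, the placeholder longtime value and the constructor arguments are illustrative, not prescribed by pymepix:

    from multiprocessing import Queue
    from pymepix.processing.baseacquisition import AcquisitionPipeline
    from pymepix.processing.udpsampler3 import UdpSampler
    from pymepix.processing.pipeline_packet_processor import PipelinePacketProcessor
    from pymepix.processing.pipeline_centroid_calculator import PipelineCentroidCalculator

    data_queue = Queue()
    longtime = 0  # placeholder for the shared timing value (assumption)

    pipeline = AcquisitionPipeline('Acq', data_queue)

    # Lower stage numbers are executed first.
    pipeline.addStage(0, UdpSampler, longtime)
    pipeline.addStage(2, PipelinePacketProcessor)
    pipeline.addStage(4, PipelineCentroidCalculator, num_processes=4)

    pipeline.start()
    # ... acquire data ...
    pipeline.stop()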

confStages(stages, longtime, use_event, event_window: tuple[float, float])[source]

Adds and configures multiple stages: one UdpSampler, plus every stage specified in the ‘stages’ parameter. ‘stages’ shall be a dictionary with one key ‘class’, whose value is either ‘pixel’ or ‘centroid’, and one key ‘num_processes’, whose value shall be an integer.
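
A sketch of a ‘stages’ descriptor matching the description above; the values, the placeholder longtime and the event window are illustrative:

    stages = {'class': 'centroid', 'num_processes': 4}

    # longtime and the event window are placeholders for values the caller supplies.
    pipeline.confStages(stages, longtime, use_event=True, event_window=(0.0, 1e-3))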

getConfigFromYaml(config_file: str = 'processing/pipelines.yaml')[source]

Loads pipeline config from yaml
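
The YAML layout itself is not documented here; a hypothetical pipelines.yaml consistent with the ‘stages’ structure described for confStages might look like the following (schema is an assumption, verify against the file shipped with pymepix):

    # hypothetical pipelines.yaml (schema assumed, not confirmed)
    stages:
      - class: pixel
        num_processes: 1
      - class: centroid
        num_processes: 4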

getStage(stage_number) AcquisitionStage[source]
getStageByPipelineObject(T) AcquisitionStage[source]

Returns the AcquisitionStage from this pipeline that holds a pipeline object of the given type, if available. Otherwise returns ‘None’.

property isRunning
property numBlobProcesses

Number of Python processes to spawn for centroiding if the centroid calculator is used

Setting this will spawn the appropriate number of processes to perform centroiding. Changes take effect on next acquisition.
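
For example, to use four centroiding worker processes on the next acquisition (pipeline being an AcquisitionPipeline instance):

    pipeline.numBlobProcesses = 4  # takes effect on the next acquisition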

pipelineFactory(data_queue, longtime, name=None, use_event=None, event_window: tuple[float, float] = None)[source]

Creates an AcquisitionPipeline, configures it and returns it. Any parameters left as ‘None’ will be taken from the given configuration.

Parameters: use_event (bool) – Whether packets are forwarded to the centroiding stage. If True, centroids are calculated.
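
A sketch of a factory call; how pipelineFactory is bound (class, static or instance method) is not shown in this listing, so the call form and the argument values below are assumptions:

    pipeline = AcquisitionPipeline.pipelineFactory(
        data_queue,
        longtime,
        name='Acq',
        use_event=True,              # forward packets to the centroiding stage
        event_window=(0.0, 1e-3),    # illustrative window (assumption)
    )
    pipeline.start()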

reconfigure()[source]

Reconfigure every stage with the latest parameters it was configured with

property stages
start()[source]

Starts all stages

stop()[source]

Stops all stages

class pymepix.processing.baseacquisition.AcquisitionStage(stage, num_processes=1)[source]

Bases: object

Defines a single acquisition stage

Usually not created directly; instead, it is created by AcquisitionPipeline. Represents a single pipeline stage and handles management of queues and message passing, as well as creation and destruction of processing objects.

Processes are not created until build() is called and do not run until start() is called

Parameters:

stage (int) – Initial position in the pipeline; lower stages are executed first

build(input_queue=None, output_queue=None)[source]
configureStage(pipeline_klass, *args, **kwargs)[source]

Configures the stage with a particular processing class

Parameters:
  • pipeline_klass (BasePipeline) – A pipeline class object

  • *args – positional arguments to pass into the class init

  • **kwargs – keyword arguments to pass into the class init
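
A sketch of configuring and wiring a stage by hand, assuming the queues are created externally (normally AcquisitionPipeline takes care of this):

    from multiprocessing import Queue
    from pymepix.processing.baseacquisition import AcquisitionStage
    from pymepix.processing.pipeline_centroid_calculator import PipelineCentroidCalculator

    stage = AcquisitionStage(2, num_processes=4)
    stage.configureStage(PipelineCentroidCalculator)  # args/kwargs go to the class init

    stage.build(input_queue=Queue(), output_queue=Queue())
    stage.start()
    # ... acquire data ...
    stage.stop()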

kill_process(pid: int, port)[source]
property numProcess

Number of processes to spawn when built

Parameters:

value (int) – Number of processes to spawn when acquisition starts

Returns:

Number of processes

Return type:

int

property outputQueue
property processes
setArgs(*args, **kwargs)[source]
property stage

Current position in the pipeline

start()[source]
startTrainID()[source]
stop(force=False)[source]
stopTrainID()[source]
pymepix.processing.baseacquisition.main()[source]

pymepix.processing.basepipeline module

Base implementation of objects relating to the processing pipeline

class pymepix.processing.basepipeline.BasePipelineObject(name, processing_step: ProcessingStep, input_queue=None, create_output=True, num_outputs=1, shared_output=None, propogate_input=True)[source]

Bases: Process

Base class for integration in a processing pipeline

Parameters:
  • name (str) – Name used for logging

  • input_queue (multiprocessing.Queue, optional) – Data queue to perform work on (usually) from previous step in processing pipeline

  • create_output (bool, optional) – Whether this creates its own output queue to pass data (Default: True)

  • num_outputs (int,optional) – Used with create_output, number of output queues to create (Default: 1)

  • shared_output (multiprocessing.Queue, optional) – Data queue to pass results into, useful when multiple processes can put data into the same queue (such as results from centroiding). Ignored if create_output is True (Default: None)

  • propogate_input (bool) – Whether the input data should be propagated further down the chain

property enable

Enables processing

Determines whether the class will perform processing; disabling processing has the effect of signalling the process to terminate. If there are objects ahead of it, they will stop receiving data. If an input queue is required, data is still taken from the queue before the processing flag is checked; this is done to prevent the queue from growing while a process behind it is still working.

Parameters:

value (bool) – Enable value

Returns:

Whether the process is enabled or not

Return type:

bool

classmethod hasOutput()[source]

Defines whether this class can output results or not, e.g. Centroiding can output results but file writing classes do not

Returns:

Whether results are generated

Return type:

bool

property outputQueues

Exposes the outputs so they may be connected to the next step

Returns:

All of the outputs

Return type:

list of multiprocessing.Queue

post_run()[source]

Function called after main processing loop

pre_run()[source]

Function called before main processing loop

process(data_type=None, data=None)[source]

Main processing function; override this to perform work.

To perform work within the pipeline, a class must override this function. General guidelines: check for the correct data type, and return None for both values if there is no output.
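
A minimal sketch of a pipeline object following these guidelines; the class name and the pass-through behaviour are illustrative and not part of pymepix:

    from pymepix.processing.basepipeline import BasePipelineObject
    from pymepix.processing.datatypes import MessageType

    class MyFilter(BasePipelineObject):
        """Hypothetical step that forwards pixel data unchanged."""

        @classmethod
        def hasOutput(cls):
            # This step produces results for the next stage.
            return True

        def process(self, data_type=None, data=None):
            # Only handle the data type this step expects.
            if data_type != MessageType.PixelData:
                return None, None
            result = data  # real work would go here
            return MessageType.PixelData, result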

pushOutput(data_type, data)[source]

Pushes results to output queue (if available)

Parameters:
  • data_type (int) – Identifier for data type (see MessageType for types)

  • data (any) – Results from processing (must be picklable)

run()[source]

Method to be run in sub-process; can be overridden in sub-class

pymepix.processing.basepipeline.main()[source]

pymepix.processing.datatypes module

Defines data that is passed between processing objects

class pymepix.processing.datatypes.MessageType(*values)[source]

Bases: IntEnum

Defines the type of message that is being passed into a multiprocessing queue

CentroidData = 3

Centroided Data

CloseFileCommand = 5

Close File Message

EventData = 2

Event Data

OpenFileCommand = 4

Open File message

PixelData = 1

Decoded Pixel/Trigger Data

RawData = 0

Raw UDP packets

TriggerData = 8

Decoded Triggers

pymepix.processing.pipeline_centroid_calculator module

Processors relating to centroiding

class pymepix.processing.pipeline_centroid_calculator.PipelineCentroidCalculator(processing_step: ~pymepix.processing.logic.processing_step.ProcessingStep = <pymepix.processing.logic.centroid_calculator.CentroidCalculator object>, input_queue=None, create_output=True, num_outputs=1, shared_output=None)[source]

Bases: BasePipelineObject

Performs centroiding on EventData received from Packet processor

process(data_type=None, data=None)[source]

Main processing function; override this to perform work.

To perform work within the pipeline, a class must override this function. General guidelines: check for the correct data type, and return None for both values if there is no output.

pymepix.processing.pipeline_packet_processor module

class pymepix.processing.pipeline_packet_processor.PipelinePacketProcessor(processing_step: ~pymepix.processing.logic.processing_step.ProcessingStep = <pymepix.processing.logic.packet_processor.PacketProcessor object>, input_queue=None, create_output=True, num_outputs=1, shared_output=None)[source]

Bases: BasePipelineObject

Processes Pixel packets for ToA, ToT, triggers and events

This class creates a UDP socket connection to SPIDR and receives the UDP packets from Timepix. It then pre-processes them and sends them off for more processing.

init_new_process()[source]
process(data_type=None, data=None)[source]

Main processing function; override this to perform work.

To perform work within the pipeline, a class must override this function. General guidelines: check for the correct data type, and return None for both values if there is no output.

pymepix.processing.rawfilesampler module

pymepix.processing.rawtodisk module

class pymepix.processing.rawtodisk.Raw2Disk(context=None)[source]

Bases: object

Class for asynchronously writing raw files. Intended to allow writing of raw data while minimizing impact on UDP reception reliability.

close(socket)[source]

Close the file currently in progress; called in main() below.

open_file(socket, filename)[source]

Creates a file with a given filename and path.

This no longer works when using two sockets for the communication; the functionality needs to be moved outside, to where there is access to the socket.

write(data)[source]

Writes data to the file. Parameter is buffer type (e.g. bytearray or memoryview)

Not sure how useful this function actually is… It completes the interface for this class but from a performance point of view it doesn’t improve things. How could this be benchmarked?

pymepix.processing.rawtodisk.main()[source]
pymepix.processing.rawtodisk.main_process()[source]

A separate process is not strictly necessary; this exists just to double-check whether this also works with multiprocessing. Does not work for debugging.

pymepix.processing.udpsampler3 module

class pymepix.processing.udpsampler3.UdpSampler(longtime, chunk_size=10000, flush_timeout=0.3, input_queue=None, create_output=True, num_outputs=1, shared_output=None)[source]

Bases: Process

Receives UDP packets from SPIDR

This class creates a UDP socket connection to SPIDR and receives the UDP packets from Timepix. It then pre-processes them and sends them off for more processing.
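
A sketch of constructing and starting a sampler on its own; longtime is a placeholder for the shared timing value normally supplied by the pipeline, and the remaining arguments simply repeat the defaults:

    from pymepix.processing.udpsampler3 import UdpSampler

    longtime = 0  # placeholder for the shared timing value (assumption)
    sampler = UdpSampler(longtime, chunk_size=10000, flush_timeout=0.3)
    sampler.start()  # multiprocessing.Process.start() -> run() in the new process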

property close_file
create_socket_connection()[source]
property enable

Enables processing

Determines whether the class will perform processing; disabling processing has the effect of signalling the process to terminate. If there are objects ahead of it, they will stop receiving data. If an input queue is required, data is still taken from the queue before the processing flag is checked; this is done to prevent the queue from growing while a process behind it is still working.

Parameters:

value (bool) – Enable value

Returns:

Whether the process is enabled or not

Return type:

bool

get_useful_packets(packet)[source]
init_new_process()[source]

Create connections and initialize variables in the new process

property outfile_name
post_run()[source]

Method gets called either at the very end of the process life or if there is a socket timeout and the raw2disk file should be closed

pre_run()[source]

Initialize things which should only be available in the new process

property record

Enables saving data to disk

Determines whether the class will perform processing; disabling processing has the effect of signalling the process to terminate. If there are objects ahead of it, they will stop receiving data. If an input queue is required, data is still taken from the queue before the processing flag is checked; this is done to prevent the queue from growing while a process behind it is still working.

Parameters:

value (bool) – Enable value

Returns:

Whether the process should record and write to disk or not

Return type:

bool

run()[source]

Method which is executed in the new process via multiprocessing.Process.start

pymepix.processing.udpsampler3.main()[source]
pymepix.processing.udpsampler3.send_data(packets, chunk_size, start=0, sleep=0.0001)[source]

pymepix.processing.udpsampler4 module

pymepix.processing.usbtrainid module

class pymepix.processing.usbtrainid.USBTrainID(name='USBTrainId')[source]

Bases: Process

Class for asynchronously writing raw files. Intended to allow writing of raw data while minimizing impact on UDP reception reliability.

connect_device(device)[source]

Establish connection to USB device

run()[source]

Method to be run in sub-process; can be overridden in sub-class

pymepix.processing.usbtrainid.main()[source]

Module contents