LiveNodePacket
QuantConnect.Packets.LiveNodePacket
LiveNodePacket()
Bases: AlgorithmNodePacket
Live job task packet: container for any live-specific job variables
Default constructor, used when deserializing the Live Task Packet from JSON
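For orientation, a minimal sketch of constructing the packet from Python, assuming LEAN's assemblies are importable through pythonnet (for example via AlgorithmImports); the deploy id and brokerage name are illustrative values only:

    from QuantConnect.Packets import LiveNodePacket

    # Default constructor, as used when the packet is round-tripped through JSON
    packet = LiveNodePacket()
    packet.deploy_id = "L-12345678"      # hypothetical deploy id
    packet.brokerage = "PaperBrokerage"  # illustrative brokerage name
    print(packet.deploy_id)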
deploy_id
deploy_id: str
Deploy Id for this live algorithm.
brokerage
brokerage: str
String name of the brokerage we're trading with
brokerage_data
brokerage_data: Dictionary[str, str]
String-String Dictionary of Brokerage Data for this Live Job
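Because the property is a .NET Dictionary rather than a plain Python dict, a sketch like the following can populate it; the key names are hypothetical, not actual LEAN configuration keys:

    from QuantConnect.Packets import LiveNodePacket
    from System.Collections.Generic import Dictionary
    from System import String

    packet = LiveNodePacket()
    data = Dictionary[String, String]()
    data.Add("account-id", "ABC123")   # hypothetical brokerage setting
    data.Add("environment", "paper")   # hypothetical brokerage setting
    packet.brokerage_data = data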
data_queue_handler
data_queue_handler: str
String name of the DataQueueHandler or LiveDataProvider we're running with
data_channel_provider
data_channel_provider: str
String name of the DataChannelProvider we're running with
disable_acknowledgement
disable_acknowledgement: bool
Gets a flag indicating whether or not the message should be acknowledged and removed from the queue
notification_events
notification_events: HashSet[str]
A set of event types to generate notifications for; notifications are delivered using notification_targets
live_data_types
live_data_types: HashSet[str]
Set of real time data types available in the live trading environment
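Both notification_events and live_data_types are .NET HashSet[str] collections; the sketch below shows how they can be built, with placeholder element values rather than a definitive list LEAN recognizes:

    from QuantConnect.Packets import LiveNodePacket
    from System.Collections.Generic import HashSet
    from System import String

    packet = LiveNodePacket()

    events = HashSet[String]()
    events.Add("OrderEvent")           # placeholder event type name
    packet.notification_events = events

    data_types = HashSet[String]()
    data_types.Add("Tick")             # placeholder live data type name
    packet.live_data_types = data_types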
channel
channel: str
User-specific channel endpoint to which the packets are sent
python_virtual_environment
python_virtual_environment: str
Virtual environment Id used to find PythonEnvironments. Ideally an MD5 hash, but environment names work as well.
host_name
host_name: str
The host name to use, if any
user_id
user_id: int
User Id placing the request
user_token
user_token: str
organization_id
organization_id: str
project_id
project_id: int
Project Id of the request
project_name
project_name: str
Project name of the request
algorithm_id
algorithm_id: str
Algorithm Id - BacktestId or DeployId - Common Id property between packets.
session_id
session_id: str
User session Id for authentication
compile_id
compile_id: str
Unique compile id of this backtest
version
version: str
Version number identifier for the Lean engine.
redelivered
redelivered: bool
An algorithm packet which has already been run and is being redelivered on this node. In this case we don't want to relaunch the task, as it may result in unexpected behaviour for the user.
algorithm
algorithm: List[int]
Algorithm binary: a zip of the algorithm's contents
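A sketch of loading a zipped project into the packet, assuming pythonnet's usual Python-list-to-byte-array conversion applies on assignment; the file path is hypothetical:

    from QuantConnect.Packets import LiveNodePacket

    packet = LiveNodePacket()
    with open("my-algorithm.zip", "rb") as f:   # hypothetical path to the zipped project
        packet.algorithm = list(f.read())       # exposed as List[int] on the Python side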
request_source
request_source: str
Request source - Web IDE or API - used for controlling result handler behaviour
ram_allocation
ram_allocation: int
The maximum amount of RAM (in MB) this algorithm is allowed to utilize
parameters
parameters: Dictionary[str, str]
The parameter values used to set algorithm parameters
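A sketch of supplying parameter values through the string-string dictionary; the parameter names are illustrative, not required keys:

    from QuantConnect.Packets import LiveNodePacket
    from System.Collections.Generic import Dictionary
    from System import String

    packet = LiveNodePacket()
    params = Dictionary[String, String]()
    params.Add("ema-fast", "10")   # hypothetical parameter
    params.Add("ema-slow", "50")   # hypothetical parameter
    packet.parameters = params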
history_provider
history_provider: str
String name of the HistoryProvider we're running with
get_algorithm_name
get_algorithm_name() -> str
Gets a unique name for the algorithm defined by this packet
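A sketch of reading the generated name after populating the identifying fields; the ids below are placeholders:

    from QuantConnect.Packets import LiveNodePacket

    packet = LiveNodePacket()
    packet.user_id = 1
    packet.project_id = 100
    packet.algorithm_id = "L-abc123"    # hypothetical deploy id
    print(packet.get_algorithm_name())  # unique name for the algorithm defined by this packet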