PACS System 0.1.0
PACS DICOM system library
kcenon::pacs::ai::ai_service_connector Class Reference

Connector for external AI inference services. More...

#include <ai_service_connector.h>


Public Types

using status_callback = std::function<void(const inference_status&)>
 Callback type for status updates.
 
using completion_callback
 Callback type for completion notification.
 

Static Public Member Functions

static auto initialize (const ai_service_config &config) -> Result< std::monostate >
 Initialize the AI service connector.
 
static void shutdown ()
 Shutdown the AI service connector.
 
static auto is_initialized () noexcept -> bool
 Check if the connector is initialized.
 
static auto request_inference (const inference_request &request) -> Result< std::string >
 Request AI inference for a study.
 
static auto check_status (const std::string &job_id) -> Result< inference_status >
 Check the status of an inference job.
 
static auto cancel (const std::string &job_id) -> Result< std::monostate >
 Cancel an inference job.
 
static auto wait_for_completion (const std::string &job_id, std::chrono::milliseconds timeout=std::chrono::minutes{30}, status_callback callback=nullptr) -> Result< inference_status >
 Wait for a job to complete.
 
static auto list_active_jobs () -> Result< std::vector< inference_status > >
 List active inference jobs.
 
static auto list_models () -> Result< std::vector< model_info > >
 List available AI models.
 
static auto get_model_info (const std::string &model_id) -> Result< model_info >
 Get information about a specific model.
 
static auto check_health () -> bool
 Check AI service health.
 
static auto get_latency () -> std::optional< std::chrono::milliseconds >
 Get current latency to the AI service.
 
static auto get_config () -> const ai_service_config &
 Get the current configuration.
 
static auto update_credentials (authentication_type auth_type, const std::string &credentials) -> Result< std::monostate >
 Update authentication credentials.
 

Private Member Functions

 ai_service_connector ()=delete
 
 ~ai_service_connector ()=delete
 
 ai_service_connector (const ai_service_connector &)=delete
 
ai_service_connector & operator= (const ai_service_connector &)=delete
 

Static Private Attributes

static std::unique_ptr< impl > pimpl_
 

Detailed Description

Connector for external AI inference services.

This class provides a unified interface for interacting with external AI inference services, including:

  • Sending DICOM studies for AI processing
  • Tracking inference job status
  • Cancelling running jobs
  • Listing available AI models

The connector uses network_system for HTTP communication and integrates with logger_system for audit logging and monitoring_system for metrics.

Thread Safety: All methods are thread-safe.

Definition at line 311 of file ai_service_connector.h.
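A minimal end-to-end sketch of the lifecycle described above. `ai_service_config` and `inference_request` are default-constructed here because their members are not listed in this reference, and `Result` is assumed to offer `std::expected`-style boolean conversion and dereference; consult ai_service_connector.h for the actual definitions.

```cpp
#include <kcenon/pacs/ai/ai_service_connector.h>

using kcenon::pacs::ai::ai_service_connector;

int main() {
    // Populate the configuration; member names are defined in the header.
    kcenon::pacs::ai::ai_service_config config{};

    // initialize() must succeed before any other operation is used.
    if (auto init = ai_service_connector::initialize(config); !init) {
        return 1;  // inspect the error carried by the Result
    }

    // Submit a study for AI processing; on success the Result holds a job ID.
    kcenon::pacs::ai::inference_request request{};
    if (auto job = ai_service_connector::request_inference(request)) {
        // Block (up to the default 30-minute timeout) for the final status.
        auto status = ai_service_connector::wait_for_completion(*job);
    }

    // Cancels pending requests and releases resources.
    ai_service_connector::shutdown();
}
```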

Member Typedef Documentation

◆ completion_callback

Initial value:
std::function<void(const std::string& job_id,
bool success,
const std::vector<std::string>& result_uids)>

Callback type for completion notification.

Examples
include/kcenon/pacs/ai/ai_service_connector.h.

Definition at line 321 of file ai_service_connector.h.

◆ status_callback

Initial value:
std::function<void(const inference_status&)>

Callback type for status updates.

Constructor & Destructor Documentation

◆ ai_service_connector() [1/2]

kcenon::pacs::ai::ai_service_connector::ai_service_connector ( )
private delete

◆ ~ai_service_connector()

kcenon::pacs::ai::ai_service_connector::~ai_service_connector ( )
private delete

◆ ai_service_connector() [2/2]

kcenon::pacs::ai::ai_service_connector::ai_service_connector ( const ai_service_connector & )
private delete

Member Function Documentation

◆ cancel()

static auto kcenon::pacs::ai::ai_service_connector::cancel ( const std::string & job_id) -> Result< std::monostate >
static nodiscard

Cancel an inference job.

Attempts to cancel a pending or running job. Jobs that have already completed cannot be cancelled.

Parameters
job_id The job identifier to cancel
Returns
Result indicating success or failure
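A short sketch of cancellation, assuming `job_id` was previously returned by request_inference(). Because already-completed jobs cannot be cancelled, a failure Result here is a normal outcome to handle.

```cpp
// Attempt to cancel; the Result indicates whether the service accepted it.
if (auto r = kcenon::pacs::ai::ai_service_connector::cancel(job_id); !r) {
    // e.g. the job already completed, or the service rejected the request
}
```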

◆ check_health()

static auto kcenon::pacs::ai::ai_service_connector::check_health ( ) -> bool
static nodiscard

Check AI service health.

Returns
true if the service is healthy and accessible
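check_health() pairs naturally with get_latency() for a quick service probe; a minimal sketch:

```cpp
using kcenon::pacs::ai::ai_service_connector;

if (ai_service_connector::check_health()) {
    // Service is reachable; get_latency() may still return nullopt
    // if no round-trip measurement is available yet.
    if (auto rtt = ai_service_connector::get_latency()) {
        // rtt->count() is the round-trip time in milliseconds
    }
}
```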

◆ check_status()

static auto kcenon::pacs::ai::ai_service_connector::check_status ( const std::string & job_id) -> Result< inference_status >
static nodiscard

Check the status of an inference job.

Parameters
job_id The job identifier returned from request_inference
Returns
Result containing current status on success
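A lightweight polling sketch using check_status(); for blocking waits, wait_for_completion() is the better fit. The members of `inference_status` are defined in the header and not shown here.

```cpp
#include <chrono>
#include <thread>

using kcenon::pacs::ai::ai_service_connector;

for (int attempt = 0; attempt < 10; ++attempt) {
    auto status = ai_service_connector::check_status(job_id);
    if (!status) break;  // unknown job ID or service error
    // inspect *status here to decide whether to keep polling
    std::this_thread::sleep_for(std::chrono::seconds(5));
}
```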

◆ get_config()

static auto kcenon::pacs::ai::ai_service_connector::get_config ( ) -> const ai_service_config &
static nodiscard

Get the current configuration.

Returns
Current AI service configuration

◆ get_latency()

static auto kcenon::pacs::ai::ai_service_connector::get_latency ( ) -> std::optional< std::chrono::milliseconds >
static nodiscard

Get current latency to the AI service.

Returns
Round-trip time to the service, or nullopt if unavailable

◆ get_model_info()

static auto kcenon::pacs::ai::ai_service_connector::get_model_info ( const std::string & model_id) -> Result< model_info >
static nodiscard

Get information about a specific model.

Parameters
model_id The model identifier
Returns
Result containing model information

◆ initialize()

static auto kcenon::pacs::ai::ai_service_connector::initialize ( const ai_service_config & config) -> Result< std::monostate >
static nodiscard

Initialize the AI service connector.

Must be called before any other operations. Sets up HTTP client, configures authentication, and validates connection.

Parameters
config Configuration options
Returns
Result indicating success or initialization error

◆ is_initialized()

static auto kcenon::pacs::ai::ai_service_connector::is_initialized ( ) -> bool
static nodiscard noexcept

Check if the connector is initialized.

Returns
true if initialized, false otherwise

◆ list_active_jobs()

static auto kcenon::pacs::ai::ai_service_connector::list_active_jobs ( ) -> Result< std::vector< inference_status > >
static nodiscard

List active inference jobs.

Returns all jobs that are currently pending or running.

Returns
Result containing list of active job statuses

◆ list_models()

static auto kcenon::pacs::ai::ai_service_connector::list_models ( ) -> Result< std::vector< model_info > >
static nodiscard

List available AI models.

Returns
Result containing list of available models
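list_models() combines with get_model_info() for model discovery. A sketch; the member used to identify a model (`id` below) is an assumption, since `model_info`'s fields are defined only in the header.

```cpp
using kcenon::pacs::ai::ai_service_connector;

if (auto models = ai_service_connector::list_models()) {
    for (const auto& m : *models) {
        // Fetch full details for one model by its identifier.
        // `m.id` is a hypothetical member name for illustration.
        // auto info = ai_service_connector::get_model_info(m.id);
    }
}
```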

◆ operator=()

ai_service_connector & kcenon::pacs::ai::ai_service_connector::operator= ( const ai_service_connector & )
private delete

◆ request_inference()

static auto kcenon::pacs::ai::ai_service_connector::request_inference ( const inference_request & request) -> Result< std::string >
static nodiscard

Request AI inference for a study.

Submits a study for AI processing and returns a job ID for tracking.

Parameters
request Inference request parameters
Returns
Result containing job ID on success, or error on failure
Note
The study must be accessible to the AI service (either via DICOM C-MOVE or DICOMweb WADO-RS)
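A submission sketch; `inference_request` is default-constructed because its members are not listed in this reference. Note the requirement above: the study must be reachable by the AI service via DICOM C-MOVE or DICOMweb WADO-RS.

```cpp
using kcenon::pacs::ai::ai_service_connector;

kcenon::pacs::ai::inference_request request{};
// Populate the request with the study to process; member names are
// defined in ai_service_connector.h.
if (auto job = ai_service_connector::request_inference(request)) {
    const std::string& job_id = *job;  // use with check_status / cancel / wait_for_completion
}
```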

◆ shutdown()

static void kcenon::pacs::ai::ai_service_connector::shutdown ( )
static

Shutdown the AI service connector.

Cancels pending requests and releases resources.


◆ update_credentials()

static auto kcenon::pacs::ai::ai_service_connector::update_credentials ( authentication_type auth_type,
const std::string & credentials ) -> Result< std::monostate >
static nodiscard

Update authentication credentials.

Parameters
auth_type New authentication type
credentials New credentials (API key, token, or username:password)
Returns
Result indicating success or failure
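A credential-rotation sketch. The enumerator `authentication_type::api_key` is an assumed name for illustration; the actual enumerators live in the header.

```cpp
using kcenon::pacs::ai::ai_service_connector;

// Rotate to a new API key without re-initializing the connector.
auto r = ai_service_connector::update_credentials(
    kcenon::pacs::ai::authentication_type::api_key,  // assumed enumerator
    "new-secret-key");
if (!r) {
    // credentials were rejected; previous credentials remain in effect
    // (behavior assumed, not stated by this reference)
}
```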

◆ wait_for_completion()

static auto kcenon::pacs::ai::ai_service_connector::wait_for_completion ( const std::string & job_id,
std::chrono::milliseconds timeout = std::chrono::minutes{30},
status_callback callback = nullptr ) -> Result< inference_status >
static nodiscard

Wait for a job to complete.

Blocks until the job completes, fails, or times out.

Parameters
job_id The job identifier to wait for
timeout Maximum time to wait
callback Optional callback for status updates
Returns
Result containing final status on completion
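A blocking wait with a custom timeout and a status callback. `std::chrono::minutes` converts implicitly to the `std::chrono::milliseconds` parameter, and the lambda matches the `status_callback` signature shown above.

```cpp
#include <chrono>

using namespace std::chrono_literals;
using kcenon::pacs::ai::ai_service_connector;

auto final_status = ai_service_connector::wait_for_completion(
    job_id,
    10min,  // give up after 10 minutes instead of the 30-minute default
    [](const kcenon::pacs::ai::inference_status& s) {
        // invoked with each status update while waiting
    });
```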

Member Data Documentation

◆ pimpl_

std::unique_ptr<impl> kcenon::pacs::ai::ai_service_connector::pimpl_
static private

The documentation for this class was generated from the following file:
ai_service_connector.h