Ultralytics
Created at | 22 Feb 2024
---|---
Editor(s) | Alejandro Carmena Magro, JD-017 Machine Learning Ops
Supervisor(s) | Alfonso Medela, JD-005 Technical Manager & Person Responsible for Regulatory Compliance
Approval | Approved
Description
Ultralytics is a software organisation, and also the name of its library, that develops and maintains state-of-the-art computer vision models. It is widely recognised for its efficient and accurate object detection capabilities, particularly with the YOLO (You Only Look Once) family of models. Ultralytics offers an easy-to-use interface for training, evaluating, and deploying object detection models, making it accessible for both research and industrial applications. The library also provides tools for model customisation and optimisation.
The latest iteration of the YOLO family is YOLOv8. It represents the cutting-edge in real-time object detection, offering significant improvements over its predecessors in terms of accuracy and speed.
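YOLO models frame detection as predicting bounding boxes plus class scores in a single forward pass, and the standard metric for comparing a predicted box against a ground-truth box is intersection over union (IoU). A minimal sketch in plain Python (boxes as `(x1, y1, x2, y2)` tuples; this is an illustrative helper, not a function from the Ultralytics API):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    # Coordinates of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

An IoU of 1.0 means identical boxes; 0.0 means no overlap. Detection benchmarks typically count a prediction as correct when its IoU with a ground-truth box exceeds a threshold such as 0.5.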
General details
- Developer(s): Ultralytics, with minor contributions from external developers.
- Open source: Yes
- Language(s): Python
- Repository: https://github.com/ultralytics/ultralytics
- License: GNU Affero General Public License v3.0 (AGPL-3.0)
- Operating system(s): Cross-platform (Windows, Linux, macOS)
- Actively maintained: Yes (latest commit within a week of this record's creation)
Intended use on the device
The SOUP is used in the medical device for the following specific purposes only:
- Training state-of-the-art object detection models and serving them as prediction services to count skin lesions in pathologies such as acne or hidradenitis suppurativa.
- Detecting human heads in clinical images where the lesions are located mainly on the head and the head occupies only a small portion of the image.
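Counting lesions from a detector's output typically reduces to filtering detections by class and confidence. A hedged sketch in plain Python (the detection tuple format, class names, and threshold below are illustrative assumptions, not the device's actual pipeline or the Ultralytics results object):

```python
def count_lesions(detections, target_class, conf_threshold=0.25):
    """Count detections of a given class at or above a confidence threshold.

    `detections` is a list of (class_name, confidence, box) tuples,
    an assumed intermediate format for this illustration.
    """
    return sum(
        1
        for cls, conf, _box in detections
        if cls == target_class and conf >= conf_threshold
    )

# Hypothetical detector output for a single clinical image.
dets = [
    ("acne_lesion", 0.91, (10, 10, 30, 30)),
    ("acne_lesion", 0.18, (50, 50, 60, 60)),  # below threshold, ignored
    ("head", 0.88, (0, 0, 200, 200)),
]
```

The confidence threshold is the key tunable here: raising it trades missed lesions (false negatives) for fewer spurious counts (false positives).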
Requirements
For the integration and safe usage of this SOUP within a software system, it is important to outline both functional and performance requirements. These requirements help mitigate risks and ensure that compatibility and performance standards are met.
Functional
- Real-time object detection: Capable of performing real-time object detection with the ability to detect multiple object classes in various environments.
- Custom model training: Train custom models on proprietary datasets, enhancing the model's applicability to a wide range of specific tasks.
- Hyperparameter flexibility: Allow for a broad range of customisable hyperparameters to fine-tune model performance.
- Inference API: Provide a straightforward and flexible interface for making predictions. This involves a simple API for inputting data and retrieving detection results.
- Portability: Offer a versatile range of options for exporting the trained models to different formats (e.g., PyTorch, ONNX), making it deployable across various platforms and devices.
- Multi-GPU support: Ability to accelerate the training process seamlessly across multiple GPUs.
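The real-time, multi-class detection requirement above implies a post-processing step that discards duplicate overlapping predictions of the same object. A minimal greedy non-maximum suppression (NMS) sketch in plain Python, for illustration only (the Ultralytics implementation is vectorised and GPU-aware):

```python
def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-maximum suppression; returns indices of kept boxes."""
    def iou(a, b):
        # Intersection-over-union of (x1, y1, x2, y2) boxes.
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        union = ((a[2] - a[0]) * (a[3] - a[1])
                 + (b[2] - b[0]) * (b[3] - b[1]) - inter)
        return inter / union if union > 0 else 0.0

    # Visit detections from most to least confident.
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        # Drop remaining boxes that overlap the kept one too much.
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_threshold]
    return keep
```

In a class-aware variant, NMS is applied per class so that overlapping detections of different classes (e.g. a lesion inside a head region) are not suppressed against each other.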
Performance
- Accuracy: Maintain high detection accuracy, with low false-positive and false-negative rates across diverse datasets.
- Speed: Support high-speed object detection, maintaining real-time performance even on systems with limited computing resources.
- Resource utilization: Efficient use of system resources (CPU/GPU, memory) is essential, ensuring it can run on a variety of hardware specifications, from high-end servers to edge devices.
- Scalability: The ability to scale based on the application's requirements, from low-resolution input for quick scanning to high-resolution input for detailed analysis.
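The accuracy requirement is typically verified with precision (which penalises false positives) and recall (which penalises false negatives). A minimal sketch, assuming true-positive, false-positive, and false-negative counts have already been tallied against ground-truth annotations (e.g. via an IoU-based match):

```python
def precision_recall(tp, fp, fn):
    """Precision and recall from true/false positive and false negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall
```

For example, 8 correct detections with 2 spurious boxes and 2 missed objects yields precision 0.8 and recall 0.8; standard detection benchmarks aggregate these over confidence thresholds into mean average precision (mAP).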
System requirements
Establishing minimum software and hardware requirements is important to mitigate risks, such as security vulnerabilities, performance issues, or compatibility problems, and to ensure that the SOUP functions effectively within the intended environment.
Software
After evaluation, we find no specific software requirements for this SOUP. It works properly on standard computing devices, which includes our environment.
Hardware
After evaluation, we find no specific hardware requirements for this SOUP. It works properly on standard computing devices, which includes our environment.
Documentation
The official SOUP documentation can be found at https://docs.ultralytics.com/.
Additionally, a criterion for validating the SOUP is that every item in the following checklist is satisfied:
- The vendor maintains clear and comprehensive documentation of the SOUP describing its functional capabilities, user guidelines, and tutorials, which facilitates learning and rapid adoption.
- The documentation for the SOUP is regularly updated and clearly outlines every feature utilized by the medical device, doing so for all integrated versions of the SOUP.
Related software items
We catalog the interconnections between the microservices within our software architecture and the specific versions of the SOUP they utilise. This mapping ensures clarity and traceability, facilitating both the understanding of the system's dependencies and the management of SOUP components.
Although the section title refers to software items, SOUP versions are mapped to microservices (which are themselves software items) because each microservice runs inside its own Docker container and therefore has an isolated runtime environment.
SOUP version | Software item(s)
---|---
8.1.34 | ALADIN, AUAS, AIHS4
Related risks
The following risks apply to this SOUP, taken from the table in document R-TF-013-002 Risk management record_2023_001:
- 58. SOUP presents an anomaly that makes it incompatible with other SOUPs or with software elements of the device.
- 59. SOUP is not being maintained nor regularly patched.
- 60. SOUP presents cybersecurity vulnerabilities.
Lists of published anomalies
The incidents, anomalies, known issues, or changes between versions for this SOUP can be found at:
History of evaluation of SOUP anomalies
29 Feb 2024
- Reviewer of the anomalies: Alejandro Carmena Magro
- Version(s) of the SOUP reviewed: 8.1.34
No anomalies have been found.
Record signature meaning
- Author: JD-004
- Reviewer: JD-003
- Approver: JD-005