Point name auto-generation tool (automatically generates point names for initial ADF files), including model training tools
T5-small is a lightweight natural language processing (NLP) model developed by Google. It adopts a unified "text-to-text" framework that casts every NLP task as one that takes text as input and produces text as output. This project uses the T5-small model for name generation. For more information, refer to the T5-small documentation.
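As a minimal sketch of the text-to-text idea, the pretrained t5-small checkpoint can be exercised with the transformers library as shown below; the "generate point name:" prompt prefix and the example input are illustrative assumptions, not necessarily the format this project trains with.
# Minimal text-to-text sketch using the pretrained t5-small checkpoint.
# The "generate point name:" prefix is an assumed, illustrative prompt format.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Text goes in, generated text comes out -- every task is framed the same way.
inputs = tokenizer("generate point name: boiler feed pump A outlet pressure", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))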
# Install uv tool
curl -LsSf https://astral.sh/uv/install.sh | sh
# Check uv version
uv --version
# Install dependency tools (for building tools and downloading models)
sudo apt install -y bear make g++ git git-lfs btop
# Install all dependencies
uv sync
# For special environments, manually install dependencies as follows:
# pip install -r requirements.scnet.txt
# Place the developed ADF files in the .tmp/data/files directory
mkdir -p .tmp/data/files && cp -f xxx .tmp/data/files
# Run start.sh script to train the model
./start.sh
# Run start.sh in the background (model training takes a long time, so background execution is recommended; check progress via the logs)
# nohup ./start.sh >start.log 2>&1 &
# View running logs
# tail -f .tmp/logs/start_xxx.log
# Use btop tool to view network, CPU, and memory status
# btop
# The trained and packaged archive is the t5_model_package.tar.gz file in the project root directory
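Once training finishes, a quick smoke test of the packaged model might look like the sketch below. The extracted directory name and the assumption that the archive contains a Hugging Face-format checkpoint are placeholders; adjust the path to whatever t5_model_package.tar.gz actually unpacks to.
# Hypothetical smoke test of the trained model.
# Assumption: t5_model_package.tar.gz unpacks to a Hugging Face-format
# checkpoint directory; "t5_model_package" below is a placeholder path.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_dir = "t5_model_package"  # placeholder: extracted checkpoint directory
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForSeq2SeqLM.from_pretrained(model_dir)

inputs = tokenizer("example ADF point description", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))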
requirements.scnet.txt contains the dependency libraries that need to be installed in the container environment on the scnet platform, and start.scnet.sh is the startup script for the scnet platform. Containers on the scnet platform use heterogeneous acceleration cards. The environment has been adapted with modified torch and transformers libraries, so you can use the global environment directly.
On the scnet platform, use the watch -n 0.2 rocm-smi command to view the running status of the heterogeneous acceleration cards.
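A quick way to confirm that the container's torch build actually sees the acceleration cards is a generic check like the one below; ROCm-style torch builds report the devices through the regular torch.cuda API, so nothing here is scnet-specific.
# Generic accelerator check; ROCm-style torch builds expose the cards
# through the torch.cuda namespace, so this is not scnet-specific.
import torch

print("torch version:", torch.__version__)
print("accelerator available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device count:", torch.cuda.device_count())
    print("device 0:", torch.cuda.get_device_name(0))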
Other project-related documentation is stored in the doc/ directory.