Fix(pipeline): optimize docker build, fix zip structure, and update UI
- Docker:
  - Explicitly install pixi environments (digger, pipeline, webbackend) during build to prevent runtime network/DNS failures.
  - Optimize pnpm config (copy method) to fix EAGAIN errors.
- Backend:
  - Refactor ZIP bundling: use flat semantic directories (1_Toxin_Mining, etc.).
  - Fix "nested zip" issue by cleaning existing archives before bundling.
  - Exclude raw 'context' directory from final download.
- Frontend:
  - Update TutorialView documentation to match new result structure.
  - Improve TaskMonitor progress bar precision (1 decimal place).
  - Update i18n (en/zh) for new file descriptions.

Co-Authored-By: Claude <noreply@anthropic.com>
AGENTS.md (26)
@@ -499,3 +499,29 @@ docker exec bttoxin-pipeline curl http://127.0.0.1:8000/api/health

4. **Health Check Path**:
   - **Cause**: Nginx routed `/health` to `/api/health`, but the backend expected `/health`.
   - **Fix**: Update the Nginx config to proxy to the correct endpoint.

### Post-Mortem: Consistency Refactoring & Fixes (2026-01-20 Update)

**Summary:**

Major refactoring to ensure consistency between script execution and the web pipeline, fix severe container startup failures, and simplify the user experience.

**1. Unified Pipeline Execution**

- **Problem**: The web backend manually orchestrated pipeline steps, leading to discrepancies with the standalone script (e.g., missing plots, different file formats).
- **Fix**: Refactored `backend/app/workers/tasks.py` to invoke `scripts/run_single_fna_pipeline.py` directly as a subprocess.
- **Result**: Web output is now guaranteed to be identical to manual script execution.

**2. Result Format & Cleanup**

- **Change**: Switched the output format from `.tar.gz` to `.zip`.
- **Feature**: Added automatic cleanup of intermediate directories (`digger/`, `shoter/`) to save disk space; only the final ZIP and logs are retained.
- **Frontend**: Updated the download logic to handle `.zip` files.

**3. Frontend Simplification**

- **Change**: Removed the CRISPR Fusion UI elements (beta feature) to reduce complexity.
- **Change**: Replaced the complex multi-stage status indicators with a "Simulated Progress Bar" for better UX during black-box script execution.
- **Fix**: Restored the "One-click load" button and fixed TypeScript build errors caused by removed variables.

**4. Critical Docker Fixes**

- **Fix (Restart Loop)**: Removed an incorrect `image: postgres` directive in `docker-compose.yml` that caused the web service to run database software instead of the app.
- **Fix (Env Path)**: Updated `.dockerignore` to exclude the host `.pixi` directory, preventing "bad interpreter" errors caused by hardcoded host paths in the container.
- **Fix (404 Error)**: Removed an erroneous `rm -rf /app/frontend` in the Dockerfile that was accidentally deleting built frontend assets.
- **Optimization**: Configured the `npmmirror` registry to resolve build timeouts in CN network environments.
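The subprocess hand-off described in point 1 can be sketched as below. This is a minimal illustration, not the project's actual task code: the function name, CLI flags, and error handling are assumptions; only the script path `scripts/run_single_fna_pipeline.py` comes from the text.

```python
import subprocess
import sys
from pathlib import Path


def run_pipeline(input_fna: Path, output_dir: Path) -> None:
    """Invoke the standalone pipeline script as a subprocess.

    Hypothetical sketch: flag names are illustrative assumptions.
    """
    cmd = [
        sys.executable, "scripts/run_single_fna_pipeline.py",
        "--input", str(input_fna),
        "--output", str(output_dir),
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    # Surface the tail of stderr so worker logs show why the script failed
    if result.returncode != 0:
        raise RuntimeError(f"Pipeline failed: {result.stderr[-500:]}")
```

Because the worker shells out to the same script a user would run by hand, there is a single code path to test, which is what makes the "identical output" guarantee hold.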

README.md (21)
@@ -407,6 +407,27 @@ The setup uses Traefik for SSL termination and routing. The backend API and fron

For detailed Docker deployment information, see [DOCKER_DEPLOYMENT.md](DOCKER_DEPLOYMENT.md)

### Building the Image Manually

To build the image manually, ensure you set the correct build context so that `pixi.toml` can be found.

```bash
# Option 1: From project root (specifying context)
docker build \
  --network=host \
  -f web/zly/docker/dockerfiles/Dockerfile.traefik \
  -t hotwa/bttoxin-app:latest \
  web/zly

# Option 2: Enter directory first
cd web/zly
docker build \
  --network=host \
  -f docker/dockerfiles/Dockerfile.traefik \
  -t hotwa/bttoxin-app:latest \
  .
```

## Troubleshooting

### pixi not found

README_CN.md (21)
@@ -404,6 +404,27 @@ Docker deployment uses a single-container mode, with Nginx hosting both the frontend static assets

For detailed Docker deployment information, see [DOCKER_DEPLOYMENT.md](DOCKER_DEPLOYMENT.md)

### Building the Image Manually

To build the image manually, make sure to set the correct build context so that Docker can find `pixi.toml`.

```bash
# Option 1: From the project root (specifying the context)
docker build \
  --network=host \
  -f web/zly/docker/dockerfiles/Dockerfile.traefik \
  -t hotwa/bttoxin-app:latest \
  web/zly

# Option 2: Enter the directory first
cd web/zly
docker build \
  --network=host \
  -f docker/dockerfiles/Dockerfile.traefik \
  -t hotwa/bttoxin-app:latest \
  .
```

## Troubleshooting

### pixi not found

backend/app/workers/tasks.py
@@ -126,42 +126,60 @@ def run_bttoxin_analysis(

```diff
 logger.info(f"Job {job_id}: Creating zip bundle")
 zip_path = output_path / f"pipeline_results_{job_id}.zip"

-# Subdirectories to bundle
-subdirs_to_zip = ["digger", "shoter", "logs"]
+# Before creating the new ZIP, delete any existing zip/tar.gz files in the
+# output directory to prevent recursive bundling
+for existing_archive in output_path.glob("*.zip"):
+    try:
+        existing_archive.unlink()
+    except Exception:
+        pass
+for existing_archive in output_path.glob("*.tar.gz"):
+    try:
+        existing_archive.unlink()
+    except Exception:
+        pass
+
+# Mapping: source directory -> display name inside the archive
+dir_mapping = {
+    "digger": "1_Toxin_Mining",
+    "shotter": "2_Toxicity_Scoring",
+    "logs": "Logs"
+}

 with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zipf:
-    # Add the input file
-    zipf.write(input_file, arcname=input_file.name)
+    # 1. Add the input file (into the Input directory)
+    zipf.write(input_file, arcname=f"Input/{input_file.name}")

-    # Add result directories
-    for subdir_name in subdirs_to_zip:
-        subdir_path = output_path / subdir_name
-        if subdir_path.exists():
-            for root, dirs, files in os.walk(subdir_path):
+    # 2. Add result directories (renamed)
+    for src_name, dest_name in dir_mapping.items():
+        src_path = output_path / src_name
+        if src_path.exists():
+            for root, dirs, files in os.walk(src_path):
                 for file in files:
                     file_path = Path(root) / file
-                    # Preserve the relative path structure
-                    arcname = file_path.relative_to(output_path)
+                    # Skip the archive itself (just in case)
+                    if file_path == zip_path:
+                        continue
+
+                    # Compute the relative path, e.g. digger/Results/foo.txt -> Results/foo.txt
+                    rel_path = file_path.relative_to(src_path)
+                    # Build the new archive path -> 1_Toxin_Mining/Results/foo.txt
+                    arcname = Path(dest_name) / rel_path
                     zipf.write(file_path, arcname=str(arcname))

-# Delete the original result directories (keep logs for debugging? or delete them too)
-# Per requirements: keep only the archive
+# Delete the original result directories
 logger.info(f"Job {job_id}: Cleaning up intermediate files")
-for subdir_name in subdirs_to_zip:
-    subdir_path = output_path / subdir_name
-    if subdir_path.exists():
-        shutil.rmtree(subdir_path)
+# Source directory names to clean up
+dirs_to_clean = ["digger", "shotter", "context", "logs", "stage"]
+for d in dirs_to_clean:
+    d_path = output_path / d
+    if d_path.exists():
+        shutil.rmtree(d_path)

 # Delete the tar.gz (if the script generated one)
 tar_gz = output_path / "pipeline_results.tar.gz"
 if tar_gz.exists():
     tar_gz.unlink()

-# Remove the stage directory (generated by run_single_fna_pipeline)
-stage_dir = output_path / "stage"
-if stage_dir.exists():
-    shutil.rmtree(stage_dir)

 # Verify the zip was created
 if not zip_path.exists():
     raise Exception("Failed to create result zip file")
```
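The directory-renaming scheme in the hunk above can be exercised in isolation. The sketch below fabricates a tiny output tree in a temporary directory and bundles it with the same mapping logic; the file names and contents are stand-ins for real pipeline output.

```python
import os
import shutil
import tempfile
import zipfile
from pathlib import Path

# Source directory -> flat semantic name inside the archive
dir_mapping = {"digger": "1_Toxin_Mining", "shotter": "2_Toxicity_Scoring"}

# Fabricate a minimal output tree (stand-in for real pipeline results)
output_path = Path(tempfile.mkdtemp())
(output_path / "digger" / "Results").mkdir(parents=True)
(output_path / "digger" / "Results" / "All_Toxins.txt").write_text("hits")
(output_path / "shotter").mkdir()
(output_path / "shotter" / "shotter_report.md").write_text("report")

zip_path = output_path / "pipeline_results_demo.zip"
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zipf:
    for src_name, dest_name in dir_mapping.items():
        src_path = output_path / src_name
        for root, _dirs, files in os.walk(src_path):
            for file in files:
                file_path = Path(root) / file
                # Rename e.g. digger/Results/x.txt -> 1_Toxin_Mining/Results/x.txt
                arcname = Path(dest_name) / file_path.relative_to(src_path)
                zipf.write(file_path, arcname=str(arcname))

with zipfile.ZipFile(zip_path) as zipf:
    names = sorted(zipf.namelist())
# names == ['1_Toxin_Mining/Results/All_Toxins.txt', '2_Toxicity_Scoring/shotter_report.md']

shutil.rmtree(output_path)  # clean up the temp tree
```

Because `relative_to(src_path)` strips the source prefix before `dest_name` is prepended, the original directory names never leak into the archive, which is what the TutorialView file tree relies on.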

Dockerfile
@@ -11,7 +11,7 @@ WORKDIR /app

```diff
 COPY . .

 # Install dependencies
-RUN pixi install
+RUN pixi install -e digger -e pipeline -e webbackend

 # Setup external database for BtToxin_Digger
 # Using the copy from tools directory included in COPY . .
```

en.json (i18n)
@@ -1,6 +1,4 @@

```diff
-{
-  "nav": {
-    "home": "Home",
+{ "nav": { "home": "Home",
   "contact": "Contact",
   "tool": "Prediction Tool",
   "docs": "Documentation"
```

@@ -111,7 +109,23 @@

```diff
 },
 "output": {
   "title": "Output Description",
-  "desc": "After analysis, you will receive a compressed package containing heatmaps and detailed data, including toxin hit lists, target score matrices, and visualization reports."
+  "desc": "After analysis, you will receive a compressed package containing heatmaps and detailed data, including toxin hit lists, target score matrices, and visualization reports.",
+  "structure": {
+    "title": "Result File Structure",
+    "desc": "The downloaded zip package contains the following directories:"
+  },
+  "files": {
+    "input_dir": "Original Input Files",
+    "mining_dir": "BtToxin_Digger Mining Results",
+    "all_toxins": "Core Result: Predicted Toxin Gene List",
+    "scoring_dir": "Shoter Toxicity Scoring Results",
+    "report": "Comprehensive Analysis Report (Markdown)",
+    "strain_heatmap": "Strain Target Activity Heatmap",
+    "hit_heatmap": "Single Sequence Activity Heatmap",
+    "strain_tsv": "Strain Target Score Raw Data",
+    "toxin_tsv": "Toxin Support Data",
+    "logs": "Execution Logs"
+  }
 },
 "note": {
   "title": "About BtToxin_Shoter",
```

zh.json (i18n)
@@ -101,7 +101,23 @@

```diff
 },
 "output": {
   "title": "输出说明",
-  "desc": "分析完成后,您将获得一个包含热图和详细数据的压缩包,包括毒素命中列表、靶标评分矩阵和可视化报告。"
+  "desc": "分析完成后,您将获得一个包含热图和详细数据的压缩包,包括毒素命中列表、靶标评分矩阵和可视化报告。",
+  "structure": {
+    "title": "结果文件结构",
+    "desc": "下载的压缩包解压后包含以下目录:"
+  },
+  "files": {
+    "input_dir": "原始输入文件目录",
+    "mining_dir": "BtToxin_Digger 毒素挖掘结果",
+    "all_toxins": "核心结果:预测到的毒素基因列表",
+    "scoring_dir": "Shoter 毒性评分结果",
+    "report": "综合分析报告 (Markdown)",
+    "strain_heatmap": "菌株靶标活性评分热图",
+    "hit_heatmap": "单序列活性评分热图",
+    "strain_tsv": "菌株靶标评分原始数据",
+    "toxin_tsv": "毒素支持度数据",
+    "logs": "运行日志目录"
+  }
 },
 "note": {
   "title": "关于 BtToxin_Shoter",
```

TaskMonitor.vue
@@ -62,12 +62,12 @@

```diff
 <!-- Progress Bar (only for running) -->
 <div v-if="taskStatus.status === 'running'" class="progress-section">
   <el-progress
-    :percentage="simulatedProgress"
+    :percentage="Number(simulatedProgress.toFixed(1))"
     :status="getProgressStatus(taskStatus.status)"
     :stroke-width="20"
     :show-text="true"
   />
-  <span class="progress-text">{{ $t('status.running.progress', { percent: simulatedProgress }) }}</span>
+  <span class="progress-text">{{ $t('status.running.progress', { percent: simulatedProgress.toFixed(1) }) }}</span>
 </div>

 <!-- Estimated Time (only for running) -->
```
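The one-decimal display above pairs with the "Simulated Progress Bar" from the post-mortem. One common way to drive such a bar for a black-box job is an asymptotic curve that approaches but never reaches 100% until the backend reports completion; a sketch follows in Python for illustration (the real logic lives in the Vue component, and the function name and 600-second time constant here are assumptions).

```python
import math


def simulated_progress(elapsed_s: float, expected_s: float = 600.0) -> float:
    """Return a display percentage in [0, 99], rounded to 1 decimal place.

    Rises quickly at first, then flattens out, so the bar keeps moving
    without ever claiming the job is done.
    """
    pct = 99.0 * (1.0 - math.exp(-elapsed_s / expected_s))
    return round(pct, 1)
```

Rounding at the display boundary (rather than in the stored state) mirrors the `toFixed(1)` change: the underlying value stays precise while the UI shows a stable one-decimal number.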

TutorialView.vue
@@ -145,18 +145,19 @@ const { t } = useI18n()

```diff
 <el-card class="file-tree-card">
   <pre class="file-tree">
-pipeline_results.zip
+pipeline_results_{id}.zip
-├── <span class="folder">digger/</span>
+├── <span class="folder">Input/</span>  # {{ t('tutorial.output.files.input_dir') }}
+│   └── <span class="file">filename.fna</span>
+├── <span class="folder">1_Toxin_Mining/</span>  # {{ t('tutorial.output.files.mining_dir') }}
 │   ├── <span class="folder">Results/Toxins/</span>
 │   │   └── <span class="file">All_Toxins.txt</span>  # {{ t('tutorial.output.files.all_toxins') }}
 │   └── ...
-├── <span class="folder">shotter/</span>
+├── <span class="folder">2_Toxicity_Scoring/</span>  # {{ t('tutorial.output.files.scoring_dir') }}
+│   ├── <span class="file">shotter_report.md</span>  # {{ t('tutorial.output.files.report') }}
 │   ├── <span class="file">strain_target_scores.png</span>  # {{ t('tutorial.output.files.strain_heatmap') }}
 │   ├── <span class="file">per_hit_{id}.png</span>  # {{ t('tutorial.output.files.hit_heatmap') }}
-│   ├── <span class="file">shotter_report.md</span>  # {{ t('tutorial.output.files.report') }}
-│   ├── <span class="file">strain_target_scores.tsv</span>  # {{ t('tutorial.output.files.strain_tsv') }}
-│   └── <span class="file">toxin_support.tsv</span>  # {{ t('tutorial.output.files.toxin_tsv') }}
+│   └── ...
-└── <span class="folder">logs/</span>  # {{ t('tutorial.output.files.logs') }}
+└── <span class="folder">Logs/</span>  # {{ t('tutorial.output.files.logs') }}
   </pre>
 </el-card>
 </div>
```

pixi.toml
@@ -1,6 +1,6 @@

```diff
 [workspace]
 name = "bttoxin-pipeline"
-channels = ["conda-forge", "bioconda", "bioconda/label/cf201901"]
+channels = ["conda-forge", "bioconda"]
 platforms = ["linux-64"]
 version = "0.1.0"
 channel-priority = "disabled"
```

@@ -8,6 +8,9 @@ channel-priority = "disabled"

```diff
 # =========================
 # digger environment: bioconda dependencies
 # =========================
+[feature.digger]
+channels = ["bioconda", "conda-forge", "bioconda/label/cf201901"]
+
 [feature.digger.dependencies]
 bttoxin_digger = "==1.0.10"
 perl = "==5.26.2"
```