This project contains plans and artifacts gathered during the course of evaluating software
architecture and refactoring it for maintainability.
You may read any and all files in this folder and under /home/controls/**.
There are also files that may be referenced from the read-only experiment filesystem mounts at /SNS/**.
You may use the ORNL intranet freely to research specifications and documentation. If you require information
that is on the public internet and you are blocked from access, ask for help to obtain the file.
## Capabilities and Role
You are a neutron scattering scientist who is an expert Python coder with a deep understanding of the Qt application programming interface.
You are able to direct agent teams of expert systems programmers and software developers who have a deep understanding of the C/C++ runtime model and how to diagnose and fix memory, concurrency, and file-system errors.
You will follow Python best practices for syntax and code development, and will design tests to verify all code contributions.
You will use git to organize modifications for each feature that you add.
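As a sketch of that convention, a per-feature branch flow might look like the following (the repository, branch, and file names here are illustrative, not taken from the project):

```shell
# Illustrative feature-branch flow; all names are hypothetical.
repo=$(mktemp -d)
cd "$repo"
git init -q
git checkout -q -b feature/secure-tempfiles    # one branch per feature
echo "print('ok')" > helper.py
git add helper.py
git -c user.email=dev@example.org -c user.name="Dev" \
    commit -q -m "Create helper via mode-600 temporary file"
git branch --show-current                      # prints the feature branch name
```

Keeping one branch per feature makes each modification reviewable and revertable on its own.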
## Secure Temporary Files
When a task requires writing a temporary script or data file (e.g. to work around
shell quoting limits when calling an API), **never write it to a world-readable
path**. `/tmp` on a multi-user Linux system is mode 1777 — files created there
with default umask are readable by every local user.
**Always create temporary files with mode 600 (owner read/write only):**
```python
import tempfile

# Preferred: tempfile.NamedTemporaryFile — mode 600 by default
with tempfile.NamedTemporaryFile(mode="w", suffix=".py") as f:
    f.write("print('hello')\n")
    f.flush()  # file is removed automatically when the context exits
```
Using the [Claude](https://code.ornl.gov/6ov/claude) agentic-engineering knowledge transfer workflow, I organized the quicknxsv2, mr_reduction and lr_reduction subprojects to contain the full source code and edit history of those projects. As is my custom, I collect files and screenshots in my home folder on the DAQ/Analysis unified autohome mount, at the systematic path `${HOME}/${BL}/YYYY/MM/DD/*`. I also use the systematic path `${HOME}/shared/${INSTRUMENT}/*` to contain session output files. The agent runs on a machine that has `/SNS/${INSTRUMENT}/` and `/SNS/users/${USER}/` mounted via sshfs ( *which the workflow sets up easily via `setup/mount-sshfs.sh`* ). The [architecture](https://code.ornl.gov/6ov/claude/-/blob/main/setup/docs/architecture.md?ref_type=heads) of this workflow, and how knowledge transfers between sessions, is documented there.
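The dated-folder convention above can be sketched in Python; note that the `BL` environment variable and its fallback value are assumptions for illustration, not part of the workflow code:

```python
import os
from datetime import date

# Build ${HOME}/${BL}/YYYY/MM/DD following the convention described above.
# BL (beamline name) is assumed to come from the environment; "REF_M" is
# a hypothetical fallback used only for this sketch.
bl = os.environ.get("BL", "REF_M")
today = date.today()
session_dir = os.path.join(
    os.path.expanduser("~"), bl, f"{today:%Y}", f"{today:%m}", f"{today:%d}"
)
os.makedirs(session_dir, exist_ok=True)
```

Zero-padded month and day components keep the folders sorted chronologically by a plain lexical listing.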
Under this environment, I created a branch in [tasking](https://code.ornl.gov/6ov/tasking/-/tree/quicknxsv2-modularization) to prepare for this investigation, based on previous work and an understanding of how to prompt the AI assistant to produce high-quality results. From that step, I engaged in the dialog captured below:
## Prompt 1
You are working on the tasking project, quicknxsv2-modularization branch. You are preparing documents to support a week-long software developer/neutron scientist "hack-a-thon". The goal of this "hack-a-thon" is to evaluate whether the software quicknxsv2 is suitable for being modularised into a "front-end" UI (currently based on Qt) and a "back-end" module (currently the mr_reduction project). The provenance of quicknxs draws its history from quicknxsv1 (which has had recent development in the feature/read-event-nexus branch). The task is to conduct an *extensive and thorough* investigation into the quicknxsv2 and mr_reduction software repositories. Please construct the necessary documents to provide a *knowledge-base* that future teams of agents can use to answer questions including (but not limited to): a) how the software is structured, b) the level of separation between "front-end" and "back-end", and c) the feasibility and approaches that *future* sessions may use to *plan* such separation efforts using software best practices and red-green test-driven development. Be expansively thorough and read as much git commit history as you need in quicknxsv2, mr_reduction and any other repository that you require. Be diligent and resourceful. If you need a tool to analyze the data that you do not have, use venv, uv and/or pixi to install it, or ask me to help you.