diff --git a/doc/specific_iocs/dae/Datastreaming.md b/doc/specific_iocs/dae/Datastreaming.md
index 0c9eb84fe..5662c9d06 100644
--- a/doc/specific_iocs/dae/Datastreaming.md
+++ b/doc/specific_iocs/dae/Datastreaming.md
@@ -1,4 +1,4 @@
-# Datastreaming
+# Data Streaming
```{toctree}
:glob:
@@ -9,26 +9,21 @@
datastreaming/*
```
-The datastreaming system is being built as part of in-kind work to ESS. It will be the system that the ESS uses to take data and write it to file - basically their equivalent to the [ICP](/specific_iocs/DAE-and-the-ICP). The system may also replace the ICP at ISIS in the future.
+The data streaming system is being built as a requirement for HRPD-X and possibly SANDALS-II, separate from (and complementary to) the `MNeuData` project. It is architecturally similar to the system that the ESS uses to take data (neutron events, sample environment, and anything else we can throw into a streaming platform) and write it to file. ISIS previously contributed to the development of the ESS's streaming pipeline as part of an in-kind project. The system will replace the ICP at ISIS.
-In general the system works by passing both neutron and SE data into [Kafka](https://kafka.apache.org/) and having clients that either view data live (like Mantid) or write the data to file, additional information can be found [here](http://accelconf.web.cern.ch/AccelConf/icalepcs2017/papers/tupha029.pdf) and [here](https://iopscience.iop.org/article/10.1088/1742-6596/1021/1/012013).
+In general, the system works by producing neutron events and histograms, sample environment data, and other diagnostic data into a [Kafka](https://kafka.apache.org/) cluster, and having clients (consumers in Kafka lingo!) that either view the data live and act on it or write it to a NeXus file. Additional information can be found [here](http://accelconf.web.cern.ch/AccelConf/icalepcs2017/papers/tupha029.pdf) and [here](https://iopscience.iop.org/article/10.1088/1742-6596/1021/1/012013).
-All data is passed into flatbuffers using [these schemas](https://github.com/ess-dmsc/streaming-data-types) - we have a tool called [saluki](https://github.com/ISISComputingGroup/saluki) which can deserialise these and make them human-readable after they've been put into Kafka.
+All data is serialised into [FlatBuffers](https://flatbuffers.dev/) blobs using [these schemas](https://github.com/ess-dmsc/streaming-data-types). We have a tool called [saluki](https://github.com/ISISComputingGroup/saluki) which can deserialise these messages and make them human-readable after they've been put into Kafka.
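+
+Each streaming-data-types schema is identified by a four-character FlatBuffers file identifier (for example `ev44` for event data, `f144` for forwarded log/sample environment data, `pl72`/`6s4t` for run start/stop), stored at bytes 4-8 of the buffer. saluki does this for you, but as an illustration the identifier can be read straight off a raw message payload (a minimal Python sketch; `msg` stands for a hypothetical consumed Kafka message):
+
+```python
+def schema_id(payload: bytes) -> str:
+    """Return the four-character FlatBuffers file identifier of a serialised message."""
+    # A FlatBuffers buffer starts with a 4-byte root offset, followed by the
+    # 4-byte file identifier that the streaming-data-types schemas define.
+    return payload[4:8].decode("ascii", errors="replace")
+
+# e.g. schema_id(msg.value()) -> 'ev44'
+```
+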
-The datastreaming layout proposed looks something like this, not including the Mantid steps or anything before event data is collected:
+The overall architecture is still being decided, but this is an initial idea of how it could look:
-
+![Data streaming architecture diagram](ISISDSLayout.drawio.svg)
-## Datastreaming at ISIS
-
-Part of our in-kind contribution to datastreaming is to test the system in production at ISIS. Currently it is being tested in the following way, with explanations of each component below:
-
-
{#kafkacluster}
## The Kafka Cluster
-There is a Kafka cluster at `livedata.isis.cclrc.ac.uk`. Port 31092 is used for the primary Kafka broker.
+There is a (non-production!) Kafka cluster at `livedata.isis.cclrc.ac.uk:31092`.
A web interface is available [here](https://reduce.isis.cclrc.ac.uk/redpanda-console/overview).
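+
+A quick way to check connectivity and see which topics exist is to ask the cluster for its metadata (a minimal sketch using the third-party `confluent-kafka` Python client; any Kafka client will do):
+
+```python
+from confluent_kafka.admin import AdminClient
+
+# Connect to the cluster and list the topics it currently holds.
+admin = AdminClient({"bootstrap.servers": "livedata.isis.cclrc.ac.uk:31092"})
+metadata = admin.list_topics(timeout=10)
+for name in sorted(metadata.topics):
+    print(name)
+```
+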
:::{important}
@@ -37,71 +32,19 @@ Automation team. See `\\isis\shares\ISIS_Experiment_Controls\On Call\autoreducti
support information.
:::
-### I want my own local instance of Kafka
-
-See {ref}`localredpanda`
-
-## Neutron Data
-
-The ICP on any instrument that is running in full event mode and with a DAE3 may stream neutron events into Kafka.
-
-This is controlled using flags in the `isisicp.properties` file:
-
-```
-isisicp.kafkastream = true
-# if not specified, topicprefix will default to instrument name in code
-isisicp.kafkastream.topicprefix =
-# FIA team run their kafka cluster on port 31092, not 9092
-isisicp.kafkastream.broker = livedata.isis.cclrc.ac.uk:31092
-isisicp.kafkastream.topic.suffix.runinfo = _runInfo
-isisicp.kafkastream.topic.suffix.sampleenv = _sampleEnv
-isisicp.kafkastream.topic.suffix.alarms = _alarms
-```
-
-In the same file, you will also need to ensure the following properties are set:
-
-```
-isisicp.incrementaleventnexus = true
-
-# Event rate, can adjust up or down
-isisicp.simulation.neventssim = 5000
+## How-to/FAQs
+See {ref}`datastreaminghowto`
-# Ensure simulated data is switched on
-isisicp.simulation.simulatedata = true
-isisicp.simulation.simulatespec0 = true
-isisicp.simulation.simulatebin0 = true
-isisicp.simulation.spreadsimevents = true
-```
-
-You additionally need to ensure you are running in event mode. You can do this using the DAE tables `wiring_event_ibextest.dat`, `detector_ibextest.dat` & `spectra_ibextest.dat`. Copies of these tables can be found at:
-
-```
-\\isis\shares\ISIS_Experiment_Controls\event_mode_tables
-```
+## Run starts/stops
+See {ref}`dsrunstartstops`
## SE Data
See [Forwarding Sample Environment](datastreaming/Datastreaming---Sample-Environment)
+## Neutron events and histograms
+See {ref}`dseventshistos`
+
## Filewriting
See [File writing](datastreaming/Datastreaming---File-writing)
-
-## System Tests
-
-:::{note}
-These tests are not currently enabled.
-:::
-
-Currently system tests are being run to confirm that the start/stop run and event data messages are being sent into
-Kafka and that a Nexus file is being written with these events. The Kafka cluster and filewriter are being run in docker
-containers for these tests and so must be run on a Windows 10 machine. To run these tests you will need to
-install [docker for windows and add yourself as a docker-user](https://docs.docker.com/docker-for-windows/install/#install-docker-desktop-on-windows).
-
-## The future of streaming at ISIS
-
-After the in-kind work finishes and during the handover, there are some proposed changes that affect the layout and
-integration of data streaming at ISIS. This diagram is subject to change, but shows a brief overview of what the future
-system might look like:
-
-
diff --git a/doc/specific_iocs/dae/ESSDSLayout.png b/doc/specific_iocs/dae/ESSDSLayout.png
deleted file mode 100644
index c8206fd47..000000000
Binary files a/doc/specific_iocs/dae/ESSDSLayout.png and /dev/null differ
diff --git a/doc/specific_iocs/dae/ESSDSLayout.xml b/doc/specific_iocs/dae/ESSDSLayout.xml
deleted file mode 100644
index 100bbb424..000000000
--- a/doc/specific_iocs/dae/ESSDSLayout.xml
+++ /dev/null
@@ -1,109 +0,0 @@
diff --git a/doc/specific_iocs/dae/FUTUREISISDSLayout.png b/doc/specific_iocs/dae/FUTUREISISDSLayout.png
deleted file mode 100644
index fbf049db2..000000000
Binary files a/doc/specific_iocs/dae/FUTUREISISDSLayout.png and /dev/null differ
diff --git a/doc/specific_iocs/dae/FUTUREISISDSLayout.xml b/doc/specific_iocs/dae/FUTUREISISDSLayout.xml
deleted file mode 100644
index d1b3ef4ef..000000000
--- a/doc/specific_iocs/dae/FUTUREISISDSLayout.xml
+++ /dev/null
@@ -1,128 +0,0 @@
diff --git a/doc/specific_iocs/dae/ISISDSLayout.drawio.svg b/doc/specific_iocs/dae/ISISDSLayout.drawio.svg
new file mode 100644
index 000000000..6347821cd
--- /dev/null
+++ b/doc/specific_iocs/dae/ISISDSLayout.drawio.svg
@@ -0,0 +1,4 @@
diff --git a/doc/specific_iocs/dae/ISISDSLayout.png b/doc/specific_iocs/dae/ISISDSLayout.png
deleted file mode 100644
index c917f67e2..000000000
Binary files a/doc/specific_iocs/dae/ISISDSLayout.png and /dev/null differ
diff --git a/doc/specific_iocs/dae/ISISDSLayout.xml b/doc/specific_iocs/dae/ISISDSLayout.xml
index 373ec0cd4..78d0f1048 100644
--- a/doc/specific_iocs/dae/ISISDSLayout.xml
+++ b/doc/specific_iocs/dae/ISISDSLayout.xml
@@ -1,146 +1,206 @@
diff --git a/doc/specific_iocs/dae/datastreaming/Datastreaming---File-writing.md b/doc/specific_iocs/dae/datastreaming/Datastreaming---File-writing.md
index 50de131e4..0a3794e94 100644
--- a/doc/specific_iocs/dae/datastreaming/Datastreaming---File-writing.md
+++ b/doc/specific_iocs/dae/datastreaming/Datastreaming---File-writing.md
@@ -2,17 +2,6 @@
The [filewriter](https://github.com/ess-dmsc/kafka-to-nexus) is responsible for taking the neutron and SE data out of Kafka and writing it to a nexus file. When the ICP ends a run it sends a config message to the filewriter, via Kafka, to tell it to start writing to file.
-There is also a [filewriter written for the SuperMuSR project](https://github.com/STFC-ICD-Research-and-Design/supermusr-data-pipeline/tree/main/nexus-writer) which we may choose to use.
+There is also a [filewriter written for the SuperMuSR project](https://github.com/STFC-ICD-Research-and-Design/supermusr-data-pipeline/tree/main/nexus-writer), which we may choose to use instead. This will be decided in [this ticket](https://github.com/ISISComputingGroup/DataStreaming/issues/2).
We are currently figuring out the deployment topology, i.e. one filewriter per instrument or a single central one. For now it is not deployed or running anywhere.
-
-#### Adding ISIS data to the filewriter configuration
-To add static data to the filewriter configuration without directly modifying the ICP's output to the `runInfo` topics a script will be used. Things like instrument name and other fields that do not change between instruments can be added here but there are a few gaps that will need to be streamed:
-- Stuff in root of file - things like inst name that can be derived from topic are ok, things that cannot be, like experiment identifier, DAE modes etc
-- Events in `detector1_events` - currently not being forwarded
-- Sample environment is tricky - we need to know what blocks to put in the file template, it's not as simple as just going "anything with the PV prefix of IN:ZOOM" although we could add to the script to look at the forwarder status and check in the currently forwarded PVs
-- Fields derived from detector events such as `total_counts`
-
-The general structure of the file can be written as this will likely not differ between instruments (at least not much) so this will be added in by the script that forwards to `ALL_runInfo`
-
-NB. I couldn't use the NeXus-Constructor for this as it no longer takes a NeXus file as an input, the version on master doesn't allow top-level fields or arbitrary groups, and there aren't many things in the ZOOM file for example that are in `/raw_data_1/instrument` which is where the NeXus constructor puts components by default. Because of events also being stored in the entry (`raw_data_1`), the NeXus-Constructor crashes when trying to output to a JSON file as it tries to write the events out which cannot be worked around unless you modify the source code to ignore that particular group. Even with this done the constructor is still quite unresponsive because of the amount of data in the in-memory NeXus file.
diff --git a/doc/specific_iocs/dae/datastreaming/Datastreaming--neutron-events-histograms.md b/doc/specific_iocs/dae/datastreaming/Datastreaming--neutron-events-histograms.md
new file mode 100644
index 000000000..d92ea0e5b
--- /dev/null
+++ b/doc/specific_iocs/dae/datastreaming/Datastreaming--neutron-events-histograms.md
@@ -0,0 +1,13 @@
+{#dseventshistos}
+# Data streaming: Neutron events and histograms
+
+## For DAE2/DAE3 instruments
+The ICP (accessed via the ISISDAE IOC) is responsible for configuring the DAE2/DAE3, and for streaming events and histograms from both.
+
+
+## For new instruments using FPGA-based acquisition electronics
+`borzoi` is responsible for communicating with the electronics and for sending run starts/stops. It will have a similar interface to `ISISDAE`, so that it can be used as a drop-in replacement in the GUI (still to be confirmed).
+
+
+## Live view, spectra plots etc.
+These will be provided by a soft IOC (`azawakh`) which consumes from the event and histogram topics (and possibly run start messages) and serves areaDetector and other PVs.
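+
+As an illustration of the kind of consumer this involves (a sketch only, not the actual `azawakh` implementation; it assumes the ess `streaming-data-types` Python package exposes `deserialise_ev44` with a `pixel_id` field, and the topic name is hypothetical):
+
+```python
+import numpy as np
+from confluent_kafka import Consumer
+# Assumption: ess-streaming-data-types provides deserialise_ev44, whose result
+# includes a pixel_id array for the events in the message.
+from streaming_data_types.eventdata_ev44 import deserialise_ev44
+
+N_PIXELS = 100_000  # illustrative detector size
+counts = np.zeros(N_PIXELS, dtype=np.int64)
+
+consumer = Consumer({
+    "bootstrap.servers": "livedata.isis.cclrc.ac.uk:31092",
+    "group.id": "azawakh-sketch",
+    "auto.offset.reset": "latest",
+})
+consumer.subscribe(["INSTNAME_events"])  # hypothetical topic name
+
+while True:
+    msg = consumer.poll(1.0)
+    if msg is None or msg.error():
+        continue
+    events = deserialise_ev44(msg.value())
+    # Accumulate a per-pixel histogram; a real IOC would then publish this
+    # via areaDetector/PV records rather than just holding it in memory.
+    counts += np.bincount(events.pixel_id, minlength=N_PIXELS)[:N_PIXELS]
+```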
diff --git a/doc/specific_iocs/dae/datastreaming/Datastreaming-How-To.md b/doc/specific_iocs/dae/datastreaming/Datastreaming-How-To.md
index 1b3299252..c79b1d711 100644
--- a/doc/specific_iocs/dae/datastreaming/Datastreaming-How-To.md
+++ b/doc/specific_iocs/dae/datastreaming/Datastreaming-How-To.md
@@ -1,3 +1,4 @@
+{#datastreaminghowto}
# Data streaming how-to guide
This is a guide for basic operations using either the development or production Kafka clusters we use for data streaming at ISIS.
@@ -24,3 +25,41 @@ This can be done through Redpanda console or via a Kafka API call.
## Run my own instance of Kafka/Redpanda
This is done easily by running [this](https://docs.redpanda.com/redpanda-labs/docker-compose/single-broker/#run-the-lab) `docker-compose` file.
+
+
+## Stream event data from the ISISICP
+The ICP on any instrument that is running in full event mode with a DAE3 may stream neutron events into Kafka. This can also be done in simulation mode.
+
+This is controlled using flags in the `isisicp.properties` file:
+
+```
+isisicp.kafkastream = true
+# if not specified, topicprefix will default to instrument name in code
+isisicp.kafkastream.topicprefix =
+# FIA team run their kafka cluster on port 31092, not 9092
+isisicp.kafkastream.broker = livedata.isis.cclrc.ac.uk:31092
+isisicp.kafkastream.topic.suffix.runinfo = _runInfo
+isisicp.kafkastream.topic.suffix.sampleenv = _sampleEnv
+isisicp.kafkastream.topic.suffix.alarms = _alarms
+```
+
+In the same file, you will also need to ensure the following properties are set:
+
+```
+isisicp.incrementaleventnexus = true
+
+# Event rate, can adjust up or down
+isisicp.simulation.neventssim = 5000
+
+# Ensure simulated data is switched on
+isisicp.simulation.simulatedata = true
+isisicp.simulation.simulatespec0 = true
+isisicp.simulation.simulatebin0 = true
+isisicp.simulation.spreadsimevents = true
+```
+
+You additionally need to ensure you are running in event mode. You can do this using the DAE tables `wiring_event_ibextest.dat`, `detector_ibextest.dat` & `spectra_ibextest.dat`. Copies of these tables can be found at:
+
+```
+\\isis\shares\ISIS_Experiment_Controls\event_mode_tables
+```
\ No newline at end of file
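+
+Once the ICP is configured and restarted, you can check that messages are reaching the cluster by watching the run-info topic (a minimal sketch using the third-party `confluent-kafka` Python client; replace `INSTNAME` with the instrument name, which is the default topic prefix):
+
+```python
+from confluent_kafka import Consumer
+
+consumer = Consumer({
+    "bootstrap.servers": "livedata.isis.cclrc.ac.uk:31092",
+    "group.id": "icp-stream-check",
+    "auto.offset.reset": "earliest",
+})
+consumer.subscribe(["INSTNAME_runInfo"])  # topic prefix defaults to the instrument name
+
+# Poll for a short while and report anything that arrives (run begin/end messages etc.).
+for _ in range(30):
+    msg = consumer.poll(1.0)
+    if msg is not None and not msg.error():
+        print(msg.topic(), len(msg.value()), "bytes")
+consumer.close()
+```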
diff --git a/doc/specific_iocs/dae/datastreaming/Datastreaming-run-starts-stops.md b/doc/specific_iocs/dae/datastreaming/Datastreaming-run-starts-stops.md
new file mode 100644
index 000000000..009f2f774
--- /dev/null
+++ b/doc/specific_iocs/dae/datastreaming/Datastreaming-run-starts-stops.md
@@ -0,0 +1,5 @@
+{#dsrunstartstops}
+# Data streaming: run starts/stops
+
+Run starts and stops will be handled by `borzoi`, which will construct the corresponding flatbuffer messages. For older instruments using a DAE2/DAE3 and the ISISICP, `ISISDAE` may need to hook into this process.
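+
+As a sketch of what this looks like at the Kafka level (not the actual `borzoi` implementation; the `serialise_pl72` helper and its parameter names are assumed from the ess `streaming-data-types` Python package and should be checked against the installed version, and the topic name is illustrative):
+
+```python
+from confluent_kafka import Producer
+
+# Assumption: ess-streaming-data-types provides serialise_pl72 (run start) and
+# serialise_6s4t (run stop); only a few of the keyword arguments are shown.
+from streaming_data_types.run_start_pl72 import serialise_pl72
+
+producer = Producer({"bootstrap.servers": "livedata.isis.cclrc.ac.uk:31092"})
+
+# Build a minimal run-start message and publish it on the run-info topic.
+payload = serialise_pl72(job_id="job-0001", filename="INSTNAME00012345.nxs", run_name="00012345")
+producer.produce("INSTNAME_runInfo", payload)
+producer.flush()
+```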
+