
Conversation

@beckermr
Collaborator

@beckermr beckermr commented Dec 9, 2025

This PR adds locks to prevent concurrent usage of the cfitsio object.

See https://py-free-threading.github.io/porting/#raising-errors-under-shared-concurrent-use.

There are a lot of potential issues here, but the two main ones are:

  • I have not structured the locks so that key compound operations are atomic.
  • The right lock structure likely depends on the exact use case.

In other words, while I have ensured that no thread uses the same cfitsio file pointer at the same time, I have not ensured that a given thread can lock all of the python+C data structures long enough to do a fully consistent operation.

For example, the lock is not held long enough for a thread to both write image data via the C layer and update the HDU list in the python layer.

So the locks as implemented now are formally correct, but they don't really help the user avoid important race conditions.
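The gap described above can be sketched in a few lines. This is a minimal, hypothetical model (none of these names are from the real fitsio API): each C-level call is protected by a per-object lock, so no two threads ever touch the cfitsio file pointer at once, but the lock is released before the python-side bookkeeping happens, so the compound operation is not atomic.

```python
import threading

class FITSHandle:
    """Hypothetical stand-in for a fitsio-style FITS object.

    Each C-layer call holds a per-object lock, so the cfitsio file
    pointer is never used by two threads at the same time.
    """

    def __init__(self):
        self._lock = threading.RLock()
        self.hdu_names = []  # stand-in for the python-level HDU list

    def _write_image_c(self, name):
        # placeholder for the C-layer write; protected by the lock
        pass

    def write_image(self, name):
        with self._lock:
            self._write_image_c(name)
        # The lock is released HERE, before the python-level HDU list
        # is updated -- another thread can observe a state where the
        # image exists in the file but the HDU list does not yet
        # reflect it. That is the race the per-call locks cannot fix.
        self.hdu_names.append(name)
```

In a single thread the behavior is unchanged; the problem only appears when a second thread reads the handle between the C write and the python-level update.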

"Shared use of FITS object between threads detected! "
"`cfitsio` does NOT support shared use of FITS file "
"pointers! You can read from the same FITS file with "
"multiple threads, but each threads to open the file "
Owner


"each thread must open"

@beckermr
Collaborator Author

I'm not sure if I am going to want to merge this one.

@beckermr beckermr changed the title feat: raise an error on shared usage of same FITS object from more than one thread feat: lock access to underlying cfitsio object Dec 10, 2025
@beckermr
Collaborator Author

I am going to close this pull request. I think locking should be the responsibility of the user since it is so use case dependent and likely needs to be done at the python level anyways.
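User-level locking along the lines suggested above might look like the following sketch (all names here are hypothetical, not part of fitsio): the caller holds one lock across the whole compound operation, so the C-layer write and any python-level bookkeeping are observed atomically by every thread that shares the lock.

```python
import threading

class DummyHandle:
    """Hypothetical stand-in for a fitsio-style FITS object."""

    def __init__(self):
        self.hdu_names = []

    def write_image(self, name):
        # stand-in for a C-layer write plus HDU-list update
        self.hdu_names.append(name)

# One lock owned by the user, not by the library.
fits_lock = threading.Lock()

def write_and_record(handle, log, name):
    # Holding the lock across BOTH steps makes the compound
    # operation atomic with respect to other threads that also
    # acquire fits_lock before touching `handle` or `log`.
    with fits_lock:
        handle.write_image(name)
        log.append(name)
```

Because the user chooses the lock's scope, this adapts to each use case in a way a library-internal per-call lock cannot.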

@beckermr beckermr closed this Dec 10, 2025
