process_read_data() should not keep retrying #65

Description

@hackerb9

Currently process_read_data() will keep retrying on EAGAIN until more bytes have been read than the read buffer can contain (hardcoded to 1024).

This is unnecessary, since process_read_data() is already called in a loop, and it is actually harmful: the program hangs while waiting for data that will never arrive.
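For context, here is a generic sketch (made-up names, not this project's code) of the usual non-blocking pattern this relies on: the outer poll()/select() loop decides when to call the read handler, and the handler reads until EAGAIN and returns, rather than retrying inside itself.

#include <errno.h>
#include <poll.h>
#include <stdio.h>
#include <unistd.h>

static void read_until_eagain(int fd)
{
	unsigned char buf[1024];
	for (;;) {
		ssize_t n = read(fd, buf, sizeof(buf));
		if (n > 0) {
			/* ... process n bytes ... */
		} else if (n < 0 && (errno == EAGAIN || errno == EWOULDBLOCK)) {
			break;	/* no data right now; poll() will call us again */
		} else {
			if (n < 0)
				perror("read failed");
			break;	/* EOF or hard error */
		}
	}
}

static void event_loop(int fd)
{
	/* Wait for readability, then drain the fd. */
	struct pollfd pfd = { .fd = fd, .events = POLLIN };
	while (poll(&pfd, 1, -1) >= 0) {
		if (pfd.revents & POLLIN)
			read_until_eagain(fd);
	}
}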

I fixed it in my fork by making these changes:

@@ -538,10 +682,11 @@ static unsigned char next_count_value(unsigned char c)
 
 static void process_read_data(void)
 {
-	unsigned char rb[1024];
+	const int RBSIZE = 1024;
+	unsigned char rb[RBSIZE];
 	int actual_read_count = 0;
-	while (actual_read_count < 1024) {
-		int c = read(_fd, &rb, sizeof(rb));
+	while (actual_read_count < RBSIZE) {
+		int c = read(_fd, &rb, (RBSIZE-actual_read_count));
 		if (c > 0) {
 			if (_cl_rx_dump) {
 				if (_cl_rx_dump_ascii)
@@ -569,17 +714,13 @@ static void process_read_data(void)
 			}
 			_read_count += c;
 			actual_read_count += c;
-		} else if (errno) {
-			if (errno != EAGAIN) {
-				perror("read failed");
-			}
-			continue; // Retry the read
 		} else {
-		    break;
+			break; // Do not continue! We already loop on reading.
 		}
 	}
 	if (_cl_rx_detailed) {
-		printf("Read %d bytes\n", actual_read_count);
+		printf("Read %d bytes %s\n", actual_read_count,
+		       (actual_read_count < RBSIZE)?"":"(buffer limit)");
 	}
 }

Note that this also fixes a buffer overflow: previously each read() requested 1024 more bytes into the 1024-byte buffer regardless of how much had already been read. Now it requests only as many bytes as remain free in the buffer.
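A standalone sketch of that bounded-read idea (hypothetical names; here the destination pointer is also advanced by the offset, which is an extra assumption beyond the diff above):

#include <unistd.h>

#define RBSIZE 1024

static ssize_t fill_buffer(int fd, unsigned char *rb)
{
	ssize_t total = 0;
	while (total < RBSIZE) {
		/* Request only the space still free, appending at the offset. */
		ssize_t c = read(fd, rb + total, RBSIZE - total);
		if (c <= 0)
			break;	/* EAGAIN, EOF, or error: stop for now */
		total += c;
	}
	return total;
}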
