Job: Read PDFs of manufacturing reports and fill out summary data in Excel, one row per PDF. Assume there are millions of PDFs.
Person A: Manually reads each PDF and fills out each row. 1 row/minute.
Person B: Writes some VBA code with optical character recognition that automates the whole process. 1,000 rows/minute.
Person C: The person working on the GPT-4 backend. 1 million rows/minute, EZ PZ.
…
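For concreteness, Person B's pipeline might look something like the sketch below: OCR each PDF, extract a few fields, and append one summary row per file. This is a hedged illustration, not a real implementation: `ocr_pdf` is a stub standing in for an actual OCR call (e.g. Tesseract), the field names (`Batch`, `Units produced`, `Defects`) are invented, and CSV stands in for Excel output.

```python
import csv
import io
import re

def ocr_pdf(path: str) -> str:
    # Stub standing in for a real OCR engine (e.g. Tesseract via pytesseract).
    # Here it just returns canned report text so the sketch is self-contained.
    return "Batch: 1042\nUnits produced: 500\nDefects: 3"

def summarize(text: str) -> list:
    # Pull the fields a summary row needs out of the OCR'd text.
    # Assumes "Label: number" lines; real reports would need sturdier parsing.
    fields = dict(re.findall(r"([\w ]+):\s*(\d+)", text))
    return [fields.get("Batch"), fields.get("Units produced"), fields.get("Defects")]

def process(pdf_paths: list) -> str:
    # One output row per PDF; CSV here stands in for the Excel sheet.
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["batch", "units", "defects"])
    for path in pdf_paths:
        writer.writerow(summarize(ocr_pdf(path)))
    return buf.getvalue()
```

The point of the hypothetical is that once something like this exists, throughput is bounded by compute, not by the person, which is what makes the pay question awkward.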
What are your opinions on how much persons A, B, C … X should (ideally) or would (realistically) be paid?
Perhaps, ideally, people should be paid on a log scale: min wage for person A, 3x min wage for person B, 6x for person C, etc. (one extra multiple of min wage per order of magnitude of throughput).
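The log-scale rule above can be written out as a tiny (hypothetical) formula, just to check that it reproduces the numbers in the thread: 1 row/min maps to 1x min wage, 1,000 rows/min to 3x, and 1 million rows/min to 6x.

```python
import math

def pay_multiplier(rows_per_minute: float) -> float:
    # Hypothetical log-scale pay rule from this post:
    # min wage at 1 row/minute, plus one extra multiple of
    # min wage per order of magnitude of throughput.
    return max(1.0, math.log10(rows_per_minute))
```

So person A gets 1x, person B 3x, and person C 6x, matching the scheme suggested above.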
Perhaps, realistically, there's no difference in wages between persons A and B; B just gets extra work. But person C, at a million times faster? He/she will finish the work before lunch. At the very least, the manager's going to notice. What then? Does C get moved to IT?
How does this all play out at the larger scale? If someone contributes (to a company, to society, whatever) millions of times more than another, does that somehow justify wealth disparity? Etc.