Breeze in Busan

Independent journalism on the politics, economy, and society shaping Busan.


Philosophy

The End of Functional Labor

When functional competence becomes abundant, markets reprice labor around the scarce ability to bear risk, authorize outcomes, and justify decisions. In the AI economy, the bottleneck is not production but approval.

Jan 22, 2026 · 14 min read

Maru Kim, Editor-in-Chief

Maru Kim, Editor-in-Chief and Publisher, is dedicated to providing insightful and captivating stories that resonate with both local and global audiences.

When Everyone Can Produce Everything

Generative AI erased the marginal cost of functional expertise. Contract drafting, ledger reconciliation, medical image triage, and policy briefs now appear in seconds, produced with a level of procedural competence that once required years of professional training. Cheap function exposes an uncomfortable architecture beneath modern labor: the scarcity of function served as the valve through which income, prestige, and institutional authority flowed.

Knowledge professions absorbed the initial shock. Junior accountants lost entry-level reconciliation and variance analysis to audit copilots. Associates in corporate law firms watched research and first-pass drafting collapse into automated retrieval and template synthesis. Residency programs in major hospitals confronted diagnostic suggestions from algorithms that outperform median trainees on dermatology and radiology benchmarks. The lower rungs of expert hierarchies did not disappear; they lost bargaining power when their functional contribution ceased to be scarce.

Productivity gains did not liberate workers from procedural burdens. Instead, they expanded the volume of outputs—reports, memos, analyses, and drafts—without dissolving the bottlenecks of authorization and responsibility. Organizations increased verification, compliance review, and reputational risk management to compensate for cheap function. The result resembles industrial overproduction: abundant goods accompanied by congested distribution. Automation floods the system with functional outputs; institutions slow the flow at the point of approval.

Labor markets now face a conceptual fracture. Functional competence no longer guarantees value because machines reproduce competence on demand. Value migrates toward the domains machines do not claim: liability, legitimacy, and recognition. Professional work becomes less about performing a task and more about owning its consequences. AI strips labor of function and leaves labor with responsibility.


THE INDUSTRIAL CONCEPT OF WORK

Industrial capitalism treated work as a functional instrument. Factories operationalized labor into discrete tasks, measured output per hour, and priced wages according to marginal productivity. The logic of efficiency reduced human skill to a unit of production: predictable, substitutable, and optimizable. Taylorist management created the template. Fordist assembly lines enforced it. Economic theory later rationalized the template into formal models that portrayed labor as a factor that could be decomposed into function and priced accordingly.

The twentieth-century managerial state internalized the same architecture. Universities standardized credentials that certified functional skill. Licensing bodies and professional associations coordinated entry barriers that rationed scarcity. Corporations and public agencies integrated performance metrics that assigned value to measurable function. A social contract emerged from that architecture: if labor produced valuable function, society produced material reward and symbolic recognition.

The concept of functional scarcity shaped the status hierarchy within knowledge-intensive professions. Corporate law firms priced billable hours as if legal reasoning were a scarce commodity. Audit firms derived revenue from expertise in reconciliation and compliance. Hospitals sorted residents through radiology reads, diagnostic differentials, and surgical rotations. Each layer of the hierarchy mirrored the assumption that function—once learned—could not be frictionlessly replicated at scale.

Industrial labor markets also encoded time as an index of functional acquisition. Seniority justified pay because time signified accumulated function. Professional apprenticeships treated functional mastery as a scarce horizon that required years of supervised practice. The promise of upward mobility rested on that scarcity; junior roles absorbed years of low-status work in exchange for eventual access to higher-value function.

That architecture contained a latent fragility. When function is scarce, labor markets can allocate value through credentialing, apprenticeship, and hierarchy. When function becomes abundant, those same institutions lose their rationale. AI did not create a cultural crisis around work; AI revealed the degree to which modern labor theory equated work with function and treated meaning, legitimacy, and responsibility as secondary attributes.

Concept Briefing

Key Terms for a Post-Functional Economy

Core economic and philosophical concepts that structure the shift from work as function to work as responsibility, judgment, and approval.

From Function to Responsibility
Zero marginal cost → functional outputs become abundant.
Functional abundance → verification and approval turn into bottlenecks.
Bottlenecks in approval → responsibility, liability, and legitimacy become scarce.
Scarce responsibility → labor value shifts from producing outputs to owning consequences.
A. Work & Automation

Pseudo-work

Work created by the gap between functional abundance and institutional accountability. Pseudo-work appears as review, annotation, sign-off, and compliance activity required to handle outputs whose necessity is uncertain but whose consequences might become costly.

Pseudo-work signals diligence and protects organizations in audits or litigation, even when it contributes little new substance to decisions.

Bullshit Jobs

Roles that even their holders privately regard as socially useless but that persist because of organizational politics, status signaling, or institutional inertia. The category describes positions, not just tasks.

Pseudo-work can grow inside meaningful jobs; bullshit jobs are hollow in their entirety.

Artificial Employment

Employment sustained less by productive necessity than by political, social, or macroeconomic goals. Governments or large institutions maintain tasks and roles to preserve income, stability, or legitimacy after automation has weakened the underlying functional need.

In an AI context, artificial employment absorbs workers displaced by automation into functions that protect social order rather than generate new output.

Apprenticeship Collapse

Structural breakdown of the traditional ladder in which juniors perform low-level functional tasks for years before acquiring high-stakes judgment roles. AI removes the low-level tasks while leaving the high-liability decisions in human hands.

Professions keep their peak responsibilities but lose the training ground through which new experts historically emerged.

B. Economy, Risk, and Scarcity

Zero-Marginal-Cost Paradox

Situation in which AI drives the marginal cost of generating functional outputs—text, code, analysis—toward zero, while the cost of verifying, approving, and assuming responsibility for those outputs remains high.

Production becomes cheap; consequence remains expensive. The paradox explains why productivity gains coexist with rising oversight workloads.

Scarcity Shift

Movement of scarcity from functional competence to responsibility, liability, and the authority to decide. As AI makes competence abundant, markets reprice labor around the scarce ability to own and absorb risk.

The scarcity shift underpins the transition from a productivity economy to a responsibility economy.

Responsibility as Labor

Redefinition of labor in which the core value lies not in performing tasks but in bearing the consequences of decisions based on those tasks. Responsibility becomes a priced input because it cannot be automated, duplicated, or infinitely scaled.

In this framing, labor equals the willingness and capacity to answer for outcomes in front of others.

Approval Economy

Configuration in which the decisive act of economic value is the authorization of outcomes rather than the production of intermediate outputs. Contracts, diagnoses, and disclosures only become real when an approved actor signs, orders, or certifies them.

AI can generate drafts; only approved subjects can turn drafts into binding facts.

Trust Capital

Accumulated confidence that institutions, investors, patients, or citizens place in a person or organization to handle errors without collapsing the system. Trust behaves like capital: it grows slowly, concentrates unequally, and can be rapidly depleted by failure.

Trust capital anchors the pricing of responsibility; machines do not yet hold it, and humans do.

C. Legitimacy, Subjecthood, and Professional Boundaries

Subjecthood

Status of being recognized as a bearer of responsibility before others. A subject is not just an agent who can act but a person or entity to whom consequences can be legitimately attributed.

AI can perform actions; subjecthood remains with humans and legal entities because only they can be blamed, sued, or forgiven.

Boundary Work

Ongoing effort by professions to draw and redraw the line between tasks that can be delegated to machines and tasks reserved for licensed experts. Boundary work protects jurisdiction, income, and authority when functional scarcity erodes.

In the AI era, boundary work shifts from “who can perform the function” to “who can legitimately authorize and own the outcome.”

Recognition & Legitimacy

Recognition refers to social acceptance of a person as a valid bearer of a role; legitimacy refers to the acceptance of a decision as valid within a shared order. Together they determine who may credibly speak, sign, decide, or represent.

As function cheapens, recognition and legitimacy become the true currencies through which labor translates into status and income.

Together, these concepts describe a labor market in which AI supplies function in excess and humans supply the scarce capacity to authorize, own, and justify outcomes.


AI AND THE DEMOCRATIZATION OF FUNCTION

Generative models collapsed the learning curve of functional competence. Text generation systems emulate legal drafting without passing through law school. Predictive models diagnose dermatological lesions without clinical rotations. Audit copilots identify anomalies in ledgers without accounting apprenticeships. Functional scarcity evaporates when procedural knowledge can be reproduced without training, time, or embodied practice.

Benchmarks illustrate the rupture. Radiology algorithms match or surpass median residents on chest X-ray interpretation while reducing variance across institutions. Dermatology classifiers flag high-risk lesions with sensitivity levels that outperform generalists. Language models achieve passing scores on bar exam sections and produce first-pass contracts with commercially acceptable structure. Accounting copilots reconcile invoices and identify reconciliation mismatches faster than junior auditors who once specialized in precisely that task. The benchmarks obscure differences in accountability and embodied judgment, yet they record a decisive fact: procedural knowledge that once justified apprenticeship can now be performed by machines at negligible marginal cost.

Zero marginal cost shifts the value of functional output from scarcity to saturation. The bottleneck migrates away from production and toward verification. Hospitals accelerate imaging throughput but slow discharge decisions because liability and institutional risk do not automate at the same rate as diagnosis. Law firms accelerate research and drafting but slow partner review because reputational risk remains human. Audit firms accelerate reconciliation but delay sign-off because compliance, litigation exposure, and investor trust remain intact. Organizations accept functional abundance but not functional independence.

The democratization of function undermines the traditional sequencing of professional development. Apprenticeship assumed that novices would perform low-level functional tasks for years, accumulating tacit knowledge before reaching positions of judgment. AI erases the low-level tasks while preserving the high-liability judgments. The ladder remains, but its rungs disappear. Without rungs, the hierarchy cannot reproduce itself through training; the profession cannot reproduce itself through time.

Functional abundance also destabilizes price formation. If machinery supplies the procedural component of legal or medical work, fees can no longer index value to hours or rote labor. Markets confront a paradox: the machinery lowers production cost while institutions maintain high prices because price reflects liability, legitimacy, and approval rather than function. The paradox signals the boundary of automation: machines copy competence easily, but machines do not yet accept blame.


THE RISE OF ARTIFICIAL AND PSEUDO-WORK

Automation multiplies outputs faster than institutions can authorize their consequences. Reports, drafts, synthetic research summaries, compliance notes, and planning documents now appear in such volume that internal review pipelines resemble bureaucratic triage. The cheapening of function does not eliminate work; it generates layers of pseudo-work dedicated to sorting, verifying, annotating, approving, or merely acknowledging outputs whose necessity remains unclear. The machinery accelerates production and the institution compensates by inflating supervision.

Empirical evidence already shows the shift. Surveys from workplace research labs indicate that generative systems flood offices with synthetic memoranda that demand correction, contextualization, or rejection. Managers report declining trust in written artifacts as authorship becomes ambiguous and as fluency ceases to guarantee competence. Legal departments expand compliance review to distinguish model-generated text from human intentions because courts assign liability to intent rather than to syntax. Hospitals integrate second-reader protocols to arbitrate between machine and resident diagnostic suggestions because clinical risk tolerates neither ambiguity nor automation bias. Pseudo-work accumulates at the intersection of abundance and accountability.

The sociology of organizations offers a structural explanation. Bureaucracies treat outputs as evidence of diligence in the absence of transparent outcomes. Automation magnifies that dynamic. Each synthetic draft becomes proof of activity rather than progress. Workers expend time producing supervisory artifacts—clarifications, disclaimers, sign-offs, and contextual notes—that allow institutions to defend actions in audit, litigation, or regulatory scrutiny. Pseudo-work emerges not from laziness but from institutional risk management.

Economic incentives reinforce the behavior. Firms can automate production but cannot automate responsibility, so responsibility becomes the scarce input. When responsibility cannot scale, organizations ration its deployment. Rationing appears as review queues, approval hierarchies, and layered compliance. The institution exhibits productivity gains on paper yet consumes those gains internally through verification, a pattern that resembles industrial overcapacity absorbing its own surplus.

The proliferation of pseudo-work reveals a deeper dependence on social recognition. Functional outputs lose value when anyone—or anything—can produce them. Social systems then demand signals that differentiate serious work from stochastic ornament. Workers respond by embedding intent, judgment, and authorship into the artifact, often through explanatory notes or contextual framing. The artifact becomes less a vehicle of function than a performance of legitimacy.


PROFESSIONAL DEFENSE MECHANISMS

Knowledge professions responded to the collapse of functional scarcity by retreating toward zones of legitimacy. The retreat does not reflect nostalgia for pre-automation labor; it reflects an institutional instinct for self-preservation. Specialized occupations defended their relevance by securitizing responsibility, jurisdiction, and judgment—domains that resist machine substitution because they anchor liability, trust, and collective approval.

Regulatory bodies moved first. Bar associations, medical boards, and accounting overseers strengthened licensing thresholds and clarified rules concerning sign-off authority. The regulatory emphasis shifted from procedural competence to accountability. A contract may be drafted by an algorithm, but only a licensed attorney can attach intention that binds parties under law. A diagnostic suggestion may be produced by a classifier, but only a physician can assign risk to a body and face consequence for misjudgment. A reconciled balance sheet may come from audit software, but only a certified accountant can bear the exposure that investors and courts recognize. Regulation protects legitimacy rather than function.

Professional firms followed a similar pattern. Law firms concentrated authority in partner review, hospitals consolidated surgical decisions in attending physicians, and audit firms tightened partner-level oversight on attestation. Machine assistance accelerated intermediate tasks but elongated sign-off. The imbalance reflects a new bottleneck. Institutional risk flows through individuals who possess the legal and reputational capacity to absorb consequence. When machines remove cost from production, human bottlenecks acquire price-setting power.

Credential markets adapted by reframing expertise as a capacity for judgment rather than as a capacity for production. Elite universities emphasized “critical reasoning,” “interpretation,” and “leadership” as functional tasks dissolved under automation. Coursework and professional training downplayed procedural skill because procedural skill degraded into commodity form. The curriculum did not prepare students to compete with machinery; the curriculum prepared students to occupy roles that authorize machinery.

The defense also appeared in narrative form. Professional discourse recast expert work as a guardian of standards, ethics, and public interest. Physicians emphasized bedside judgment and duty of care. Lawyers invoked constitutional reasoning and adversarial fairness. Accountants foregrounded investor protection and systemic trust. The narratives aligned with the emerging scarcity: machines supply competence, but institutions demand legitimacy. The stories did not merely justify the professions; they recalibrated the basis of rent extraction.

The defense reveals a blunt economic fact. Automation destroys scarcity at the point of function, so value migrates to the point of consequence. Professions survived the shock not by outperforming machines in task execution but by monopolizing the right to assume liability.


WHEN FUNCTION CEASES TO BE SCARCE

Labor markets assign value to what cannot be automated. AI altered that calculus by converting functional competence into a commodity with no supply constraint. Once functional tasks lose scarcity, price no longer indexes skill; price indexes exposure, liability, and the authority to decide. The scarcity migrates from production to consequence.

Pricing evidence confirms the movement. Law firms do not discount partner review even when drafting accelerates, because courts attach liability to intent rather than to text. Hospitals do not reduce fees for attending physicians when AI accelerates triage, because malpractice attaches to judgment rather than to diagnosis. Audit firms do not lower partner-level billing when reconciliation becomes automatic, because investor trust attaches to attestation rather than to arithmetic. Price formation follows the scarce input—judgment under uncertainty—not the abundant substrate of computation.

Scarcity also shifts into timelines. Automated function compresses production time but leaves decision time intact. Institutions cannot accelerate consequence arbitrarily because legal and reputational exposure demand careful pacing. The asymmetry creates economic rents at the decision boundary. The actor who owns the final decision point controls the bottleneck and therefore controls the margin. In knowledge industries, partners, attendings, and senior signatories occupy that boundary.

Credential markets adjust accordingly. Universities and licensing bodies no longer claim to teach procedures that machines already perform; they claim to cultivate interpretive capacity, ethical reasoning, strategic discernment, and leadership. The marketing language appears ornamental, yet the adjustment reflects a deeper economic recognition: scarcity resides in the right to authorize and the right to refuse. An automated draft has no value until an authorized actor converts it into an institutionally recognized act.

Scarcity also migrates into reputation. Automation does not produce trust; automation produces output. Trust accumulates through history, accountability, and demonstrated willingness to bear consequence. Institutions treat trust as capital in the strict sense: a stock that can be drawn down when decisions go wrong. Machines do not hold such capital. Human actors hold it to the extent that firms, courts, and clients believe they can absorb blame without destabilizing the system.

The scarcity shift reveals a structural misalignment between technological capability and institutional architecture. AI lowers the cost of competence, yet institutions continue to price responsibility. The two forces do not converge because competence and responsibility belong to different ontologies. Competence solves problems; responsibility allocates risk. Markets now price risk rather than skill, and labor markets reward those who can shoulder risk with sufficient legitimacy to satisfy courts, regulators, and clients.

The shift exposes the limits of traditional productivity metrics. The metrics assumed that productivity derived from function, and that labor value could be measured through output volume or speed. Automation breaks that equivalence. Output becomes too cheap to meter, so value migrates to bottlenecks where consequence coagulates. Productivity no longer means producing more; productivity means absorbing more exposure without collapsing trust.


RESPONSIBILITY, JUDGMENT, AND APPROVAL

Responsibility surfaces as the primary axis of value because responsibility absorbs uncertainty. Institutions cannot remove uncertainty from medical risk, contractual conflict, or financial disclosure; they can only assign the burden of uncertainty to actors capable of bearing consequence. AI reduces uncertainty in diagnosis or drafting, yet liability remains anchored in humans because courts and regulators do not recognize software as a moral or legal subject. Responsibility becomes labor precisely because responsibility can fail.

Judgment occupies the second axis. Machines handle deterministic tasks with high reliability, but complex disputes seldom resolve into deterministic form. Legal negotiation, clinical triage, and corporate strategy require navigation through ambiguous evidence, contested interests, and future consequences that cannot be statistically previewed. Judgment does not merely select the optimal option; judgment persuades others that the decision satisfies institutional and moral standards. Economic value accrues to judgment because collective decision-making cannot be automated without consensus about what counts as a correct future.

Approval forms the institutional layer that stabilizes responsibility and judgment. Organizations demand approval to convert outputs into legitimate actions. Approval can derive from regulatory sanction, professional authority, or social recognition. A contract requires approval to bind; a medical order requires approval to treat; a financial statement requires approval to disclose. Approval functions not as ornamentation but as the hinge between action and consequence. AI cannot grant approval because approval presupposes standing within a legal or moral community.

Identity intersects with these foundations. In industrial economies, identity served as a psychological accessory to work; in a post-functional economy, identity becomes part of the mechanism through which legitimacy is conferred. Clients trust lawyers who embody adversarial loyalty, patients trust physicians who embody duty of care, investors trust auditors who embody prudence. Trust attaches to roles that signify responsibility, not to skill that can be cloned. Identity becomes an instrument of allocation.

Meaning re-enters not as sentiment but as coordination. Automation fragments functional tasks into cheap outputs that require reaggregation. Workers supply meaning by interpreting outputs, prioritizing actions, and negotiating trade-offs. Meaning assigns direction to abundance. Without meaning, abundance produces paralysis. Meaning becomes operational rather than romantic; meaning organizes systems that cannot stabilize themselves through computation alone.

Care persists because no institution can automate the reciprocal recognition embedded in human vulnerability. Hospitals can automate diagnostics, but recovery still depends on relational attention. Courts can automate research, but justice still depends on acknowledging dignity and harm. Families can automate scheduling, but affection remains irreducible. Care does not derive value from scarcity alone; care derives value from irreplaceability.

The new foundations of work—responsibility, judgment, approval, identity, meaning, and care—do not replace function; they metabolize function. AI supplies function in excess; society demands decisions that transform function into consequence. Labor shifts from producing outputs to authorizing outcomes. The shift does not diminish work; the shift alters what counts as work.


VALUE AFTER FUNCTION

AI dismantles the premise that labor derives value from function. Function loses scarcity, and scarcity migrates to responsibility. Markets behave accordingly. Prices drift toward the actors who absorb risk rather than toward the actors who perform tasks. Economic value attaches to consequence because consequence cannot be automated without a subject capable of being blamed. The subject becomes the locus of value.

Responsibility acquires the characteristics of a scarce asset. Liability cannot scale, cannot be cloned, and cannot be executed without standing in a legal and moral order. Institutional trust accumulates in individuals and firms not because they produce function but because they can withstand the failure of function. AI converts competence into a commodity; society converts consequence into labor. The two movements do not cancel each other. They define the new division of work.

Approval reinforces the mechanism. Approval transforms outputs into actions by granting them legitimacy. Legitimacy decides which consequences bind and which consequences dissipate. Machines produce outputs; institutions produce legitimacy. The asymmetry explains why abundance in function coexists with scarcity in judgment. A saturated field of competence still requires a gate through which consequences must pass.

Economic theory can interpret the entire transformation through the grammar of scarcity. When function ceases to be scarce, value relocates to the boundary where action becomes responsibility. Responsibility becomes labor because responsibility incurs cost, generates risk, and demands trust. Trust resembles capital: it accumulates unevenly, dissipates under stress, and is held only by actors capable of social recognition. Recognition, in turn, depends on identity. The chain connects price to personhood.

The transformation carries a philosophical implication. If work no longer establishes value by producing useful function, work establishes value by authorizing outcomes in a shared world. Authorization links labor to the collective, because only the collective can approve the consequences of action. Labor becomes a site of recognition rather than a site of production. Recognition does not merely comfort the individual; recognition stabilizes the social order by allocating consequence to an identifiable subject.

AI automates function but cannot automate subjecthood. Subjecthood cannot be coded because subjecthood is defined by the capacity to bear responsibility before others. Markets demand subjects because markets cannot settle consequences without them. Institutions demand subjects because institutions cannot assign blame or legitimacy to software. Philosophy demands subjects because philosophy cannot think action without agency.

The economy of function yields to an economy of approval. The price of labor reflects not what the worker can do but what the worker can answer for. In that arrangement, the question of labor becomes a question of existence: who counts, who decides, and who bears the weight of consequences that no machine will claim.
