New York University Law Review

Topic: Legal Theory

Trademark Litigation as Consumer Conflict

Michael Grynberg

Trademark litigation typically unfolds as a battle between competing sellers who argue over whether the defendant’s conduct is likely to confuse consumers. This is an unfair fight. In the traditional narrative, the plaintiff defends her trademark while simultaneously protecting consumers at risk for confusion. The defendant, relatively speaking, stands alone. The resulting “two-against-one” storyline gives short shrift to the interests of nonconfused consumers who may have a stake in the defendant’s conduct. As a result, courts are too receptive to nontraditional trademark claims where the case for consumer harm is questionable. Better outcomes are available by appreciating trademark litigation’s parallel status as a conflict between consumers. This view treats junior and senior trademark users as proxies for different consumer classes and recognizes that remedying likely confusion among one group of consumers may cause harm to others. Focusing on the interests of benefited and harmed consumers also minimizes the excessive weight given to moral rhetoric in adjudicating trademark cases. Consideration of trademark’s consumer-conflict dimension is therefore a useful device for critiquing trademark’s expansion and assessing future doctrinal developments.

Two and Twenty: Taxing Partnership Profits in Private Equity Funds

Victor Fleischer

Private equity fund managers take a share of the profits of the partnership as the equity portion of their compensation. The tax rules for compensating general partners create a planning opportunity for managers who receive the industry standard “two and twenty” (a two percent management fee and twenty percent profits interest). By taking a portion of their pay in the form of partnership profits, fund managers defer income derived from their labor efforts and convert it from ordinary income into long-term capital gain. This quirk in the tax law allows some of the richest workers in the country to pay tax on their labor income at a low rate. Changes in the investment world—the growth of private equity funds, the adoption of portable alpha strategies by institutional investors, and aggressive tax planning—suggest that reconsideration of the partnership profits puzzle is overdue.
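To make the planning opportunity concrete, a back-of-the-envelope illustration may help (the profit figure and tax rates below are assumptions chosen for illustration, not numbers drawn from the Article). If a fund realizes $100 million in profits, the manager’s twenty percent carried interest is

\[ 0.20 \times \$100\text{ million} = \$20\text{ million}. \]

Taxed as long-term capital gain at an assumed 15% rate, that carry generates roughly a $3 million tax bill; taxed as ordinary income at an assumed 35% rate, the bill would be roughly $7 million. The gap between those two figures illustrates the rate-conversion piece of the quirk the Article describes; the deferral of recognizing the income in the first place is a separate, additional benefit.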

While there is ample room for disagreement about the scope and mechanics of the reform alternatives, this Article establishes that the status quo is an untenable position as a matter of tax policy. Among the various alternatives, perhaps the best starting point is a baseline rule that would treat carried interest distributions as ordinary income. Alternatively, Congress could adopt a more complex “Cost-of-Capital Method” that would convert a portion of carried interest into ordinary income on an annual basis, or Congress could allow fund managers to elect into either the ordinary income or “Cost-of-Capital Method.” While this Article suggests that treating distributions as ordinary income may be the best, most flexible approach, any of these alternatives would be superior to the status quo. These alternatives would tax carried interest distributions to fund managers in a manner that more closely matches how our tax system treats other forms of compensation, thereby improving economic efficiency and discouraging wasteful regulatory gamesmanship. These changes would also reconcile private equity compensation with our progressive tax system and widely held principles of distributive justice.

Are All Legal Probabilities Created Equal?

Yuval Feldman, Doron Teichman

At the core of the economic analysis of law lies the concept of expected sanctions, which are calculated by multiplying the severity of the sanction that is applied to wrongdoers by the probability that it will be applied. This probability is the product of several sequential probabilities involving the different actors responsible for sanctioning wrongdoers (e.g., police, prosecutors, judges, jurors, etc.). Generally, legal economists treat different legal probabilities as fungible, simply multiplying them much like any other sequential probabilistic situation. This Article challenges this assumption, demonstrating that people perceive and are affected by different types of legal probabilities in distinct ways. More specifically, it shows that uncertainty associated with the substance of the law and uncertainty associated with imperfect enforcement should not be treated equivalently.
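As a stylized sketch of the standard model the Article questions (the formula is the conventional one described above; the numbers are hypothetical), the expected sanction can be written as

\[ E[S] = S \times p_{\text{police}} \times p_{\text{prosecution}} \times p_{\text{conviction}}, \]

so a $10,000 fine backed by a 50% chance of detection, a 40% chance of prosecution, and a 50% chance of conviction yields an expected sanction of $10,000 × 0.5 × 0.4 × 0.5 = $1,000. On the standard account, any mix of probabilities with the same product is equivalent; the Article’s claim is that compliance need not track that equivalence when the uncertainty stems from vague legal standards rather than from imperfect enforcement.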

To demonstrate this point, this Article presents a series of between-subjects experimental surveys that measure and compare participants’ attitudes toward compliance in conditions of uncertainty. Study participants—several hundred students from Israel and the United States—answered questions in the context of one of several variations on the same hypothetical scenario. While the expected sanction was the same in each variation, the source of uncertainty differed. These studies confirmed that people are less likely to comply when uncertainty stems from the imprecision of law’s substance than when uncertainty stems from the imperfect enforcement of clear law.

Originalism Is Bunk

Mitchell N. Berman

Critical analysis of originalism should start by confronting a modest puzzle: Most commentators suppose that originalism is deeply controversial, while others complain that it means too many things to mean anything at all. Is one of these views false? If not, how can we square the term’s ambiguity with the sense that it captures a subject of genuine debate? Perhaps self-professed originalists champion a version of originalism that their critics don’t reject, while the critics challenge a version that proponents don’t maintain.

Contemporary originalists disagree about many things: which feature of the Constitution’s original character demands fidelity (framers’ intent, ratifiers’ understanding, or public meaning); why such fidelity is required; and whether this interpretive obligation binds judges alone or citizens, legislators, and executive officials too. But on one dimension of potential variability—the dimension of strength—originalists are mostly united: They believe that those who follow some aspect of a provision’s original character must give that original aspect priority over all other considerations (with a possible exception for continued adherence to non-originalist judicial precedents). That is, when the original meaning (or intent, etc.) is adequately discernible, the interpreter must follow it. This is the thesis that self-professed originalists maintain and that their critics (the non-originalists) deny.

Non-originalists have challenged this thesis on varied wholesale grounds, which include: that the target of the originalist search is undiscoverable or nonexistent; that originalism is self-refuting because the framers intended that the Constitution not be interpreted in an originalist vein; and that originalism yields bad outcomes. This Article proceeds differently. Instead of mounting a global objection—one purporting to hold true regardless of the particular arguments on which proponents of originalism rely—I endeavor to catalogue and critically assess the varied arguments proffered in originalism’s defense.

Those arguments are of two broad types—hard and soft. Originalism is “hard” when grounded on reasons that purport to render it (in some sense) inescapably true; it is “soft” when predicated on contingent and contestable weighings of its costs and benefits relative to other interpretive approaches. That is, hard arguments seek to show that originalism reflects some sort of conceptual truth or follows logically from premises the interlocutor already can be expected to accept; soft arguments aim to persuade others to revise their judgments of value or their empirical or predictive assessments. The most common hard arguments contend that originalism is entailed either by intentionalism or by binding constitutionalism. Soft arguments claim that originalist interpretation best serves diverse values like democracy and the rule of law. I seek to show that the hard arguments for originalism are false and that the soft arguments are implausible.

The upshot is not that constitutional interpretation should disregard framers’ intentions, ratifiers’ understandings, or original public meanings. Of course we should care about these things. But originalism is a demanding thesis. We can take the original character of the Constitution seriously without treating it as dispositive. That original intents and meanings matter is not enough to render originalism true.

The Disutility of Injustice

Paul H. Robinson, Geoffrey P. Goodwin, Michael D. Reisig

For more than half a century, the retributivists and the crime-control instrumentalists have seen themselves as being in an irresolvable conflict. Social science increasingly suggests, however, that this need not be so. Doing justice may be the most effective means of controlling crime. Perhaps partially in recognition of these developments, the American Law Institute’s recent amendment to the Model Penal Code’s “purposes” provision—the only amendment to the Model Code in the forty-eight years since its promulgation—adopts desert as the primary distributive principle for criminal liability and punishment.

That shift to desert has prompted concerns by two groups that, ironically, have been traditionally opposed to each other. The first group—those concerned with what they see as the over-punitiveness of current criminal law—worries that setting desert as the dominant distributive principle means continuing the punitive doctrines they find so objectionable, and perhaps making things worse. The second group—those concerned with ensuring effective crime control—worries that a shift to desert will create many missed crime-control opportunities and will increase avoidable crime.

The first group’s concern about over-punitiveness rests upon an assumption that the current punitive crime-control doctrines of which it disapproves are a reflection of the community’s naturally punitive intuitions of justice. However, as Study 1 makes clear, today’s popular crime-control doctrines in fact seriously conflict with people’s intuitions of justice by exaggerating the punishment deserved.

The second group’s concern that a desert principle will increase avoidable crime exemplifies the common wisdom of the past half-century that ignoring justice in pursuit of crime control through deterrence, incapacitation of the dangerous, and other such coercive crime-control programs is cost-free. However, Studies 2 and 3 suggest that doing injustice has real crime-control costs. Deviating from the community’s shared principles of justice undermines the system’s moral credibility and thereby undermines its ability to gain cooperation and compliance and to harness the powerful forces of social influence and internalized norms.

The studies reported here provide assurance to both groups. A shift to desert is not likely to undermine the criminal justice system’s crime-control effectiveness, and indeed may enhance it; nor is it likely to increase the system’s punitiveness, and indeed may reduce it.

Debunking the Purchaser Welfare Account of Section 2 of the Sherman Act: How Harvard Brought Us a Total Welfare Standard and Why We Should Keep It

Alan J. Meese

The last several years have seen a vigorous debate among antitrust scholars and practitioners about the appropriate standard for evaluating the conduct of monopolists under section 2 of the Sherman Act. While most of the debate over possible standards has focused on the empirical question of each standard’s economic utility, this Article undertakes a somewhat different task: It examines the normative benchmark that courts have actually chosen when adjudicating section 2 cases. This Article explores three possible benchmarks—producer welfare, purchaser welfare, and total welfare—and concludes that courts have opted for a total welfare normative approach to section 2 since the formative era of antitrust law. Moreover, this Article will show that the commitment to maximizing total social wealth is not a recent phenomenon associated with Robert Bork and the Chicago School of antitrust analysis. Instead, it was the Harvard School that led the charge for a total welfare approach to antitrust generally and under section 2 in particular. The normative consensus between Chicago and Harvard and parallel case law is by no means an accident; rather, it reflects a deeply rooted desire to protect practices—particularly “competition on the merits”—that produce significant benefits in the form of enhanced resource allocation, without regard to the ultimate impact on purchasers in the monopolized market. Those who advocate repudiation of the longstanding scholarly and judicial consensus reflected in the total welfare approach to section 2 analysis bear the heavy burden of explaining why courts should, despite considerations of stare decisis, suddenly reverse themselves and adopt such a different approach for the very first time, over a century after passage of the Act.
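For readers who want the three benchmarks in conventional economic notation, a brief gloss may help (the notation is standard welfare economics, not terminology drawn from the Article): writing CS for the surplus of purchasers in the relevant market and PS for producer surplus, a purchaser welfare standard evaluates conduct by its effect on CS alone, whereas a total welfare standard evaluates it by its effect on

\[ W = CS + PS. \]

Conduct that increases PS by more than it reduces CS therefore passes a total welfare test even if purchasers in the monopolized market end up worse off, which is the “without regard to the ultimate impact on purchasers” feature the Article defends.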

Secondary Considerations in Nonobviousness Analysis: The Use of Objective Indicia Following KSR v. Teleflex

Natalie A. Thomas

One of the basic requirements for patenting an invention is that the invention be nonobvious. Following the Supreme Court’s decision in Graham v. John Deere, secondary considerations—also known as objective indicia of nonobviousness—have been considered when determining whether an invention is nonobvious. Secondary considerations provide tangible evidence of the economic and motivational issues relevant to the nonobviousness of an invention. Types of secondary-considerations evidence include commercial success, long-felt but unmet need, and copying by competitors. For many years, the Federal Circuit’s teaching, suggestion, or motivation test often eliminated the need for the court to rely on secondary considerations in the obviousness inquiry. Due to the Federal Circuit’s stringent application of this test, the obviousness inquiry was generally resolved by examining the prior art.

In 2007, the Supreme Court decided KSR v. Teleflex, which endorsed a flexible obviousness analysis and rejected the Federal Circuit’s strict application of the teaching, suggestion, or motivation test. Following KSR, scholars predicted that secondary-considerations evidence would provide a critical tool for patentees seeking to demonstrate the nonobviousness of an invention. Inspired by that prediction, this Note evaluates how secondary-considerations evidence has been utilized in the first few years post-KSR. It finds that the Federal Circuit has continued to impose stringent relevancy requirements on the use of secondary-considerations evidence, and that it remains difficult for patentees to employ secondary considerations in favor of a nonobviousness conclusion. Specifically, secondary-considerations evidence has not been used with much success outside of pharmaceutical patent cases. More often than not, the Federal Circuit has summarily dismissed secondary-considerations evidence as insufficient in cases involving mechanical arts patents. This Note concludes by suggesting that the Federal Circuit’s current practice for using secondary considerations should inform proposals by scholars for industry-specific tailoring of the patent system and patent law’s use of secondary considerations, and that the Federal Circuit should continue to engage with secondary-considerations evidence in order to provide more guidance to lower courts during the post-KSR transition period.

Innovations on the Cutting Edge of Ariad: Reinventing the Written Description Requirement

Jonathan E. Barbee

For the great majority of its history, the written description requirement was an often-ignored relic of the patent statute. As technology advanced, the written description requirement developed teeth as a means for invalidating patent claims during litigation. Written description doctrine reached its peak in Ariad Pharmaceuticals, Inc. v. Eli Lilly & Co., when the Federal Circuit created a significant setback for groundbreaking innovation. Ariad demonstrated that the written description doctrine lacked sufficient recognition of the fundamental policies and purposes of the patent system and that this could have serious consequences for innovation. This Note attempts to rectify the written description doctrine by reorienting the doctrine in innovation policy. To do so, I first apply an alternative version of the “prospect theory” of patents to conventional patent policy. Based on this policy calculus, I then devise a reformed hypothetical innovation test that looks outside of the “four corners” of the patent and considers the larger impact that the written description has on the patent system. Without such doctrinal reform, the written description doctrine of Ariad and its legacy risk undermining the incentives that motivate inventors to undertake cutting-edge technology.

Madison Lecture: Living Our Traditions

The Honorable Robert H. Henry

In the annual James Madison Lecture, Robert Henry, former Chief Judge of the United States Court of Appeals for the Tenth Circuit, explores Justice John Marshall Harlan II’s notable dissent in Poe v. Ullman. President Henry carefully examines Justice Harlan’s method of constitutional interpretation. Refusing to adopt a “literalistic” reading of the Constitution and instead looking to the “history and purposes” of a particular constitutional provision, Justice Harlan developed an approach that serves as a source of both flexibility and restraint. Of particular importance is Justice Harlan’s recognition of the role that “living” traditions play in supplying meaning to the concept of due process of law. What emerges from this probing review of Justice Harlan’s Poe dissent is a moderate and thoughtful response to originalism.

Demsetz Underground: Busking Regulation and the Formation of Property Rights

James Graham Lake

The Metropolitan Transportation Authority regulates busking—playing music or performing for tips in a public place—differently depending on the subway station. Some stations are reserved for members of a program called Music Under New York (MUNY), while at the others, anyone willing to pay the standard fare to enter the station is allowed to busk. As it happens, the distribution of MUNY and non-MUNY stations within the subway system follows an economic pattern. MUNY covers the stations where we should expect busking to impose the highest externality costs. This economic pattern of coverage provides the substantive basis for this Note: Because MUNY’s distribution is consistent with Harold Demsetz’s foundational theory about the economic development of private property rights, MUNY provides a window into a question left open by Demsetz and contested in subsequent literature—the question of how private property develops. This Note analyzes MUNY to make two contributions to the growing body of literature describing how property rights develop. First, observing the role that changing First Amendment doctrine played in MUNY’s formation, this Note argues that exogenous legal norms act as constraints on the mechanisms through which new property rights develop. Second, it argues that Demsetz’s theory should take account of the inertia built into property systems and the external shocks that help overcome this stasis.