Saturday, November 3, 2012

Tower of Babel, the Semantics Initiative, and Ontology

What's in a name? That which we call a rose by any other name would smell as sweet. ~ William Shakespeare 
The beginning of wisdom is to call things by their right names. ~ Chinese Proverb

At a symposium held by the Securities Industry and Financial Markets Association (SIFMA) in March 2012, Andrew G. Haldane, Executive Director of Financial Stability for the Bank of England, gave a speech titled “Towards a common financial language”.[1] Using the imagery of the Tower of Babel, Mr. Haldane described how…
Finance today faces a similar dilemma. It, too, has no common language for communicating financial information. Most financial firms have competing in-house languages, with information systems silo-ed by business line. Across firms, it is even less likely that information systems have a common mother tongue. Today, the number of global financial languages very likely exceeds the number of global spoken languages.

The economic costs of this linguistic diversity were brutally exposed by the financial crisis. Very few firms, possibly none, had the information systems necessary to aggregate quickly information on exposures and risks.[2] This hindered effective consolidated risk management. For some of the world’s biggest banks that proved terminal, as unforeseen risks swamped undermanned risk systems.

These problems were even more acute across firms. Many banks lacked adequate information on the risk of their counterparties, much less their counterparties’ counterparties. The whole credit chain was immersed in fog. These information failures contributed importantly to failures in, and seizures of, many of the world’s core financial markets, including the interbank money and securitization markets.

Why is this? One would think that the financial industry would be in a great position to capitalize on the growth of digital information. After all, data has been the game changer for decades. But Wall Street, while proficient at handling market data and certain financial information, is not well prepared for the explosion in unstructured data.

The so-called “big data” problem of handling massive amounts of unstructured data is not just about implementing new technologies like Apache Hadoop. As discussed at the CFTC Technology Advisory Committee on Data Standardization held September 30, 2011, there is significant confusion in the industry regarding “semantics”.[3]

EDM Council - FIBO Semantics Initiative

The “semantic barrier” is a major issue in the financial industry, and standards such as ISO 20022 were created to address it.[4] For example, what some participants in the payments industry call an Ordering Customer, others call a Payer or Payor, while still others call a Payment Originator or Initiator. Context also plays a role: the Payment Originator/Initiator is a Debtor/Payor in a credit transfer, while that same Payment Originator/Initiator is a Creditor/Payee in a direct debit.[5]
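As a concrete illustration, here is a minimal sketch in Python (the term lists and role names are our own invention, not drawn from the ISO 20022 data dictionary) of how synonymous business terms might be normalized to a single canonical concept, with the party role resolved by payment context.

```python
# Minimal sketch: normalizing synonymous payment terms to a canonical party role.
# Term lists and role names are illustrative only, not taken from ISO 20022 itself.

SYNONYMS = {
    "ordering customer": "payment_originator",
    "payer": "payment_originator",
    "payor": "payment_originator",
    "payment originator": "payment_originator",
    "initiator": "payment_originator",
}

# The same canonical concept maps to different roles depending on context.
ROLE_BY_CONTEXT = {
    ("payment_originator", "credit_transfer"): "Debtor",
    ("payment_originator", "direct_debit"): "Creditor",
}

def canonical_role(term: str, context: str) -> str:
    """Resolve a free-text business term to a context-specific party role."""
    concept = SYNONYMS.get(term.strip().lower())
    if concept is None:
        raise KeyError(f"Unknown term: {term!r}")
    return ROLE_BY_CONTEXT[(concept, context)]

print(canonical_role("Ordering Customer", "credit_transfer"))  # -> Debtor
print(canonical_role("Payment Originator", "direct_debit"))    # -> Creditor
```

The point of the sketch is that the hard work lies not in the lookup code but in agreeing on the canonical concepts and contexts in the first place, which is precisely what a standard like ISO 20022 provides.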

It should therefore be apparent that the intended use of systems relies on “human common sense” and understanding. Unfortunately, especially within large organizations or across an industry, the boundaries of intended use are often not documented and exist only as “tribal knowledge”. Even when well documented, informal language maintained in policies and procedures can result in unintentional misapplication, with consequences no less hazardous than intentional misapplication.


Overcoming semantic barriers...

If your avocation[6] involves organizing information and/or modeling data and systems, you invariably start asking epistemological[7] questions, even though such questions may not be immediately practical to the task at hand: What is knowledge? How is knowledge acquired? To what extent is it possible for a given concept, either physical or abstract, to be known? Can computers understand meaning from the information they process and synthesize knowledge? “Can machines think?”[8]

Such questions are the impetus for an ongoing debate about “the scope and limits of purely symbolic models of the mind and about the proper role of connectionism in cognitive modeling.” Harnad (1990) framed this quandary as the Symbol Grounding Problem: “How can the semantic interpretation of a formal symbol system be made intrinsic to the system, rather than just parasitic on the meanings in our heads?”[9] The meaning triangle[10] illustrates the underlying problem.

 
Figure 1 – Ogden and Richards (1923) meaning triangle 

Figure 1 is a model of how linguistic symbols stand for the objects they represent, which in turn provide an index to concepts in our minds. Note, too, that the triangle represents the perspective of only one person, whereas communication often takes place between two or more persons (or devices such as computers). Hence, for two people or devices to understand each other, the meaning that relates term, referent and concept must align.

Now consider that different words might refer to the same concept, or worse, that the same word could have different meanings. Take the term “orange”: are we referring to a fruit or to a color? This area of study is known as semantics.[11]
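A toy sketch in Python (with a hand-built, invented lexicon; a real system would draw on a resource such as WordNet) shows the shape of the problem: one term indexes two concepts, and only the surrounding context selects between them.

```python
# Toy lexicon: one term, two senses. Real systems would draw on a lexical
# resource such as WordNet rather than a hand-built dictionary.
LEXICON = {
    "orange": [
        {"sense": "citrus fruit", "category": "Fruit",
         "context_cues": {"eat", "peel", "juice", "sweet"}},
        {"sense": "color between red and yellow", "category": "Color",
         "context_cues": {"paint", "painted", "hue", "bright", "shade"}},
    ],
}

def disambiguate(term: str, sentence: str) -> str:
    """Pick the sense whose context cues best overlap the surrounding words."""
    words = set(sentence.lower().split())
    senses = LEXICON[term]
    best = max(senses, key=lambda s: len(s["context_cues"] & words))
    return best["sense"]

print(disambiguate("orange", "she cut the orange and drank the juice"))
print(disambiguate("orange", "the wall was painted a bright orange shade"))
```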


Relating semantics to ontology

As the meaning triangle exemplifies, monikers—whether they be linguistic[12] or symbolic[13]—are imperfect indexes. They rely on people having the ability to derive denotative (ie, explicit) meaning, and/or connotative (ie, implicit) meaning from words/signs. If the encoder (ie, sender) and the decoder (ie, receiver) do not share both the denotative and connotative meaning of a word/sign, miscommunication can occur. In fact, at the connotative level, context determines meaning.

Analytic approaches to this problem fall under the domain of semiotics,[14] which for our purposes encompasses the study of words and signs as elements of communicative behavior. Consequently, we consider linguistics and semiosis[15] to come under the subject of semiotics.[16] Semiotics, in turn, is divided into three branches or subfields: (i) semantics; (ii) syntactics;[17] and (iii) pragmatics.[18]

Various disciplines are used to model concepts within this field of study. These disciplines include, but are not necessarily limited to, lexicons/synsets, taxonomies, formal logic, symbolic logic, schema related to protocols (ie, syntactics), schema related to diagrams (ie, semiosis), actor-network theory, and metadata [eg, structural (data about data containers), descriptive (data about data content)]. In combination, these various methods form the toolkit for ontology work.
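To make one of these building blocks concrete, here is a minimal sketch in Python (the category names and metadata are invented for illustration) pairing a taxonomy (is-a links) with a bit of descriptive metadata and a simple subsumption check; the other disciplines listed above layer further structure on this kind of backbone.

```python
# A miniature taxonomy (is-a links) with descriptive metadata attached.
# The categories are invented for illustration.
IS_A = {
    "CreditTransfer": "PaymentInstruction",
    "DirectDebit": "PaymentInstruction",
    "PaymentInstruction": "FinancialTransaction",
    "FinancialTransaction": "Transaction",
}

METADATA = {
    "PaymentInstruction": {"description": "An instruction to move funds between parties"},
}

def subsumed_by(child: str, ancestor: str) -> bool:
    """Walk the is-a chain to test whether `ancestor` subsumes `child`."""
    node = child
    while node is not None:
        if node == ancestor:
            return True
        node = IS_A.get(node)
    return False

print(subsumed_by("CreditTransfer", "Transaction"))            # True
print(subsumed_by("Transaction", "CreditTransfer"))            # False
print(METADATA["PaymentInstruction"]["description"])           # descriptive metadata
```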

Ontology involves the study of the nature of being, existence, and reality, as well as the basic categories of being and their relations. It encompasses answering metaphysical[19] questions relating to quiddity, that is, the quality that makes a thing what it is—the essential nature of a thing.

Admittedly, there are divergent views amongst practitioners as to what constitutes ontology, as well as how to classify semiotics and related methodologies. To be sure, keeping all these concepts straight is not without difficulty for those without formal training. Further, “ontology has become a prevalent buzzword in computer science. An unfortunate side-effect is that the term has become less meaningful, being used to describe everything from what used to be identified as taxonomies or semantic networks, all the way to formal theories in logic.”[20]

Figure 2 is a schematic diagram illustrating a hierarchical conceptualization of ontology, its relation to epistemology, metaphysics, and semiotics, as well as its relation to cognitive science. Figure 2 also shows how semiotics encompasses linguistics and semiosis.

Figure 2 – Conceptualization of epistemology, metaphysics, ontology, and semiotics


Knowledge representation and first order logic

Knowledge representation (KR) is an area of artificial intelligence research aimed at representing knowledge in symbols to facilitate systematic inferences from knowledge elements, thereby synthesizing new elements of knowledge. KR involves analysis of how to reason accurately and effectively, and how best to use a set of symbols to represent a set of facts within a knowledge domain.

A key parameter in choosing or creating a KR is its expressivity. The more expressive a KR, the easier and more compact it is to express a fact or element of knowledge within the semantics and syntax of that KR. However, more expressive languages are likely to require more complex logic and algorithms to construct equivalent inferences. A highly expressive KR is also less likely to be complete and consistent, whereas less expressive KRs may be both complete and consistent.
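A minimal sketch in Python may help ground the trade-off (the facts and the single rule are invented for illustration): a rule with variables makes the representation more expressive than a flat list of facts, but every such rule adds work for the reasoner, which must iterate to a fixed point.

```python
# Minimal forward chaining over facts and one rule with variables.
# Facts and the rule are invented for illustration.
facts = {
    ("subclass", "Bond", "Security"),
    ("subclass", "Security", "FinancialInstrument"),
}

def apply_transitivity(known):
    """Rule: subclass(X, Y) and subclass(Y, Z) => subclass(X, Z)."""
    new = set()
    for (p1, x, y1) in known:
        for (p2, y2, z) in known:
            if p1 == p2 == "subclass" and y1 == y2:
                new.add(("subclass", x, z))
    return new - known

# Iterate to a fixed point: richer rule sets mean more of these passes.
while True:
    inferred = apply_transitivity(facts)
    if not inferred:
        break
    facts |= inferred

print(("subclass", "Bond", "FinancialInstrument") in facts)  # True (inferred)
```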

Recent developments in KR include the concept of the Semantic Web and the development of XML-based knowledge representation languages and standards, including the Resource Description Framework (RDF), RDF Schema, Topic Maps, DARPA Agent Markup Language (DAML), Ontology Inference Layer (OIL), and the Web Ontology Language (OWL).[21]
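For a flavor of what these standards look like in practice, here is a minimal sketch using the open-source rdflib library for Python (assuming rdflib is installed; the example namespace, classes and instance are invented for illustration) that builds a tiny RDF graph with an RDFS class hierarchy and serializes it as Turtle.

```python
# Minimal RDF/RDFS sketch with rdflib (pip install rdflib).
# The example namespace, classes and instance are invented for illustration.
from rdflib import Graph, Namespace, RDF, RDFS, Literal

EX = Namespace("http://example.org/finance#")

g = Graph()
g.bind("ex", EX)

# A tiny class hierarchy plus one instance with a human-readable label.
g.add((EX.Bond, RDFS.subClassOf, EX.Security))
g.add((EX.Security, RDFS.subClassOf, EX.FinancialInstrument))
g.add((EX.AcmeBond2020, RDF.type, EX.Bond))
g.add((EX.AcmeBond2020, RDFS.label, Literal("Acme 5% Bond due 2020")))

print(g.serialize(format="turtle"))
```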

 
Figure 3 – Adapted from Pease (2011) [Figure 15] and Obrst (2012)

SUMO, an open-source formal ontology expressed in a declarative language based on first order logic,[22] resides on the higher end of the scale in terms of both formality and expressiveness. The upper-level ontology of SUMO consists of ~1120 terms, ~4500 axioms and ~795 rules, and has been extended with a mid-level ontology (MILO) as well as domain-specific ontologies. Written in the SUO-KIF language, it is the only formal ontology that has been mapped to the entire WordNet lexicon.

Formal languages such as DAML, OIL, and OWL are geared towards classification. What distinguishes SUMO from other modeling approaches (eg, UML or frame-based) is its use of predicate logic. SUMO preserves the ability to structure taxonomic relationships and inheritance, but extends those techniques with an expressive set of terms, axioms and rules that can more accurately model spatial and temporal concepts, both physical and abstract.
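As an illustration of the kind of statement predicate logic adds beyond a class hierarchy, consider the following axiom (our own illustrative rule, written in first-order notation, not quoted from SUMO): every credit transfer has a debtor and a creditor, and they are distinct parties.

$$\forall x \,\bigl(\mathrm{instance}(x, \mathrm{CreditTransfer}) \rightarrow \exists y \,\exists z \,(\mathrm{debtor}(y, x) \wedge \mathrm{creditor}(z, x) \wedge y \neq z)\bigr)$$

A frame or taxonomy language can state that CreditTransfer is a kind of PaymentInstruction; a rule that quantifies over individuals and relates several relations at once requires the expressiveness of first-order logic.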

Nevertheless, KR modeling can suffer from the “garbage in, garbage out” syndrome. Developing domain ontologies with SUMO is no exception. That is why in a large ontology such as SUMO/MILO, validation is very important.[23]


Models of concepts are second derivatives

Returning to Ogden and Richards’ (1923) meaning triangle, the relations between term, referent and concept may be phrased more precisely in causal terms:
  • The matter (referent) evokes the writer's thought (concept). 
  • The writer refers the matter (referent) to the symbol (term). 
  • The symbol (term) evokes the reader's thought (concept). 
  • The reader refers the symbol (term) back to the matter (referent). 

When the writer refers the matter to the symbol, the writer is effectively modeling the referent. The method used is informal language. However, without a formal semantic system in which to model concepts, the use of natural language as a representation of concepts suffers from the fact that informal languages have meaning only by virtue of human interpretation of words. Likewise, it is important not to mistake the term for the referent itself.

In calculus the second derivative of a function ƒ is the derivative of the derivative of ƒ. Likewise, a KR archetype or replica of a referent (ie, a physical or abstract thing) can be considered a second derivative, whereby the concept is the first derivative, and the model of the concept is the second derivative. What can add to the confusion is the term labeling the referent, versus the term labeling the model of the concept of the referent. One's inclination is to substitute the label for the referent.
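Expressed symbolically (our own notation), the calculus side of the analogy is simply

$$f''(x) = \frac{d}{dx}\!\left(\frac{df}{dx}\right),$$

with the referent playing the role of f, the concept the role of the first derivative, and the model of the concept the role of the second derivative.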

Figure 4 – Adapted from Sowa (2000), “Ontology, Metadata, and Semiotics” 

Thus, it is important to recognize that a representation of a thing is, at its most fundamental level, still a surrogate, a substitute for the thing itself. It is a medium of expression. In fact, “the only model that is not wrong is reality and reality is not, by definition, a model.”[24] Still, a pragmatic method for addressing this concern derives from the development and use of ontology.


Unstructured data and a way forward…

Over the past two decades much progress has been made on shared conceptualizations and the theory of semantics, as well as the application of these disciplines to advanced computer systems. Such progress has provided the means to derive meaningful information from the explosion of unstructured data (ie, “big data”) overwhelming the banking industry, as well as the many other industries that suffer from the same issue.

The missing link underlying “the cause of many a failure to design a proper dialect... [is] the general lack of an upper ontology that could provide the basis for mid-level ontologies and other domain specific metadata dictionaries or lexicons.” The key, then, is use of an upper-level ontology that “gives those who use data modeling techniques a common footing to stand on before they undertake their tasks.”[25] SUMO, as an open-source formal ontology (think Linux), is a promising technology for this purpose, with an evolving set of tools and an emerging array of applications solving real-world problems.

As Duane Nickull, Senior Technology Evangelist at Adobe Systems, explained, “[SUMO] provides a level setting for our existence and sets up the framework on which we can do much more meaningful work.”[26]


About SUMO:
The Suggested Upper Merged Ontology (SUMO) and its domain ontologies form the largest formal public ontology in existence today. They are being used for research and applications in search, linguistics and reasoning. SUMO is the only formal ontology that has been mapped to all of the WordNet lexicon. SUMO is written in the SUO-KIF language. Sigma Knowledge Engineering Environment (Sigma KEE) is an environment for creating, testing, modifying, and performing inference with ontologies developed in SUO-KIF (e.g., SUMO, MILO). SUMO is free and owned by the IEEE. The ontologies that extend SUMO are available under GNU General Public License. Adam Pease is the Technical Editor of SUMO.  

For more information: http://www.ontologyportal.org/index.html. Also see: http://www.ontologyportal.org/Pubs.html for list of research publications citing SUMO.

Users of SUO-KIF and Sigma KEE consent, by use of this code, to credit Articulate Software and Teknowledge in any writings, briefings, publications, presentations, or other representations of any software that incorporates, builds on, or uses this code. Please cite the following article in any publication with references:

Pease, A., (2003). The Sigma Ontology Development Environment. In Working Notes of the IJCAI-2003 Workshop on Ontology and Distributed Systems, August 9, 2003, Acapulco, Mexico.



Footnotes:
[1] Speech by Mr Andrew G Haldane, Executive Director, Financial Stability, Bank of England, at the Securities Industry and Financial Markets Association (SIFMA) “Building a Global Legal Entity Identifier Framework” Symposium, New York, 14 March 2012.

[2] Counterparty Risk Management Policy Group (2008).

[3] CFTC Technology Advisory Subcommittee on Data Standardization Meeting to Publicly Present Interim Findings on: (1) Universal Product and Legal Entity Identifiers; (2) Standardization of Machine-Readable Legal Contracts; (3) Semantics; and (4) Data Storage and Retrieval. Meeting notes by Association of Institutional Investors. Source: http://association.institutionalinvestors.org/

[4] See: http://www.iso20022.org/

[5] SWIFT Standards Team, and Society for Worldwide Interbank Financial Telecommunication (2010). ISO 20022 for Dummies. Chichester, West Sussex, England: Wiley. http://site.ebrary.com/id/10418993.

[6] The term “avocation” has three seemingly conflicting definitions: 1. something a person does in addition to a principal occupation; 2. a person's regular occupation, calling, or vocation; 3. Archaic diversion or distraction. Note: we purposefully selected this term because it relates to pragmatics; specifically, the “semantic barrier”.

[7] e•pis•te•mol•o•gy, n., 1. a branch of philosophy that investigates the origin, nature, methods, and limits of human knowledge; 2. the theory of knowledge, esp the critical study of its validity, methods, and scope.

[8] Turing, A.M. (1950). “Computing machinery and intelligence” Mind, 59, 433-460.

[9] Harnad, Stevan (1990). “The Symbol Grounding Problem” Physica D, 42(1-3), pp. 335-346.

[10] Ogden, C., and Richards, I. (1923). The meaning of meaning. A study of the influence of language upon thought and of the science of symbolism. Supplementary essays by Malinowski and Crookshank. New York: Harcourt.

[11] se•man•tics, n., 1. linguistics the branch of linguistics that deals with the study of meaning, changes in meaning, and the principles that govern the relationship between sentences or words and their meanings; 2. significs the study of the relationships between signs and symbols and what they represent; 3. logic a. the study of interpretations of a formal theory; b. the study of the relationship between the structure of a theory and its subject matter; c. the principles that determine the truth or falsehood of sentences within the theory. 

[12] lin•guis•tics, n., the science of language, including phonetics, phonology, morphology, syntax, semantics, pragmatics, and historical linguistics.

[13] Use of term “symbolic” refers to semiosis and the term “sign,” which is something that can be interpreted as having a meaning for something other than itself, and therefore able to communicate information to the person or device which is decoding the sign. Signs can work through any of the senses: visual, auditory, tactile, olfactory or taste. Examples include natural language, mathematical symbols, signage that directs traffic, and non-verbal interaction such as sign language. Note: we categorized linguistics to be a subclass of semiotics.

[14] se•mi•ot•ics, n., the study of signs and sign processes (semiosis), indication, designation, likeness, analogy, metaphor, symbolism, signification, and communication. Semiotics is closely related to the field of linguistics, which, for its part, studies the structure and meaning of language more specifically.

[15] The term “semiosis” was coined by Charles Sanders Peirce (1839–1914) in his theory of sign relations to describe a process that interprets signs as referring to their objects. Semiosis is any form of activity, conduct, or process that involves signs, including the production of meaning. [Related concepts: umwelt, semiosphere]

[16] One school of thought argues that language is the semiotic prototype and its study illuminates principles that can be applied to other sign systems. The opposing school argues that there is a meta system, and that language is simply one of many codes (ie, signs) for communicating meaning.

[17] syn•tac•tics, n., 1. the branch of semiotics that deals with the formal properties of symbol systems. 2. logic, linguistics the grammatical structure of an expression or the rules of well-formedness of a formal system.

[18] prag•mat•ics, n. 1. logic, philosophy the branch of semiotics dealing with the causal and other relations between words, expressions, or symbols and their users. 2. linguistics the analysis of language in terms of the situational context within which utterances are made, including the knowledge and beliefs of the speaker and the relation between speaker and listener. [Note: pragmatics is closely related to the study of semiosis.]

[19] met•a•phys•ics, n., 1. the branch of philosophy that treats of first principles, includes ontology and cosmology, and is intimately connected with epistemology; 2. philosophy, especially in its more abstruse branches; 3. the underlying theoretical principles of a subject or field of inquiry.

[20] Pease, Adam. (2011). Ontology: A Practical Guide. Angwin: Articulate Software Press.

[21] A review of the listed KR approaches is outside the scope of this discussion. See Pease (2011), Ontology: A Practical Guide, ‘Chapter 2: Knowledge Representation’ for a more in-depth discussion/comparison.

[22] While OWL is based on description logic, its primary construct is taxonomy (ie, frame language).

[23] See Pease (2011), Ontology: A Practical Guide, pp. 89-91 for further discussion on validation.

[24] Haldane, Andrew G. (2009). “Why banks failed the stress test” Speech, Financial Stability, Bank of England, at the Marcus-Evans Conference on Stress-Testing, London, 9-10 February 2009.

[25] Duane Nickull, Senior Technology Evangelist, Adobe Systems; foreword to “Ontology” by Adam Pease.

[26] Multiple contributors (2009). “Introducing Semantic Technologies and the Vision of the Semantic Web.” Frontier Journal, Volume 6, Number 7, July 2009. See: http://www.hwswworld.com/pdfs/frontier66.pdf

Monday, October 29, 2012

Dodd-Frank: Is smart politics, smart business?

It may be smart politics to fight Dodd-Frank, but is it smart business? Throughout the primary and general election season, Republicans have repeatedly invoked the law’s 848-page girth—and its rules on, among other things, trading derivatives and swaps—as a symbol of government overreach that is killing jobs.[1]

As noted by Michael Greenberger, professor at the University of Maryland's Francis King Carey School of Law, tactics used to try to stop Dodd-Frank include attempts at blocking its passage, starving regulators financially so the law cannot be enforced, and most recently, challenging the final rules with a flood of lawsuits in federal courts claiming that regulators have used improper cost-benefit analyses.[2]

There seem to be two major themes underlying Wall Street’s resistance. The first is the cost Dodd-Frank will impose on certain institutions’ existing business models, either exposing these firms to more competition or rendering certain lucrative ways of doing business no longer viable. The second is the cost of implementing and/or upgrading technology to properly support the deluge of new requirements, some of which contemplate the building of infrastructure that does not currently exist. [See 77 FR 21278 - Customer Clearing Documentation, Timing of Acceptance for Clearing, And Clearing Member Risk ...]

The evidence of the fight to reduce Dodd-Frank’s impact on derivatives trading is scattered throughout the regulations promulgated by the CFTC. The final rules contain a summary of comments by industry participants and discussion of the CFTC's views in response. Take for example the discussion surrounding customer clearing documentation and trilateral agreements…

Six commentators [3] went into detail on why trilateral agreements are bad for the markets, noting that such agreements discourage competition and efficient pricing, compromise anonymity, reduce liquidity, increase the time between execution and clearing, introduce conflicts of interest, and prevent the success of swap execution facilities (SEFs).

Opposing this view were many of the major banks [4] who contend that without the trilateral agreements some market participants may have reduced access to markets. The banks suggest that "instead of prohibiting trilateral agreements, the CFTC could require that the allocation of credit limits across executing counterparties be specified by the customer, rather than the futures commission merchant (FCM), who would confirm the customer’s allocation to the identified executing counterparties."

Contrary to such protests, the CFTC asserts that the rules do not prohibit trilateral agreements; rather, they prohibit certain provisions contained in trilateral or bilateral agreements. Further, the CFTC emphasizes that nothing in these rules would restrain a swap dealer (SD) or major swap participant (MSP) from establishing bilateral limits with each of its counterparties, much less impair an SD’s or MSP’s ability to conduct due diligence on each of its counterparties.

In fact, rather than discouraging competition, the law prohibits an SD or MSP from adopting any process that imposes any material anti-competitive burden on trading or clearing. In addition, derivatives clearing organization (DCO) rules provide for the non-discriminatory clearing of swaps.

This would seem conceptually amenable, but it is argued that pre- and post-trade uncertainty caused by a delay between the time of trade execution and the time of trade acceptance into clearing would undermine market integrity and, by implication, impede liquidity, efficiency and market stability.

Accordingly, the CFTC revised the language to clarify that, for swaps that will be submitted for clearing, an SD or MSP may continue to manage its risk by limiting its exposure to the counterparty with whom it is trading. This clarification is intended both to emphasize the need to conduct appropriate risk management and to address the concern that, until straight-through processing is achieved, SDs and MSPs will still need to manage risk to a counterparty before a trade is accepted or rejected for clearing.

And therein lies the crux of the matter. For prompt and efficient clearing to occur, the rules, procedures and operational systems of the trading platform and the clearinghouse must align. Vertically integrated trading and clearing systems currently process high volumes of transactions quickly and efficiently. But they also form a monopoly.

Under the distributed structure contemplated by Title VII, each SEF and designated contract market (DCM) is required to assure equal access to all DCOs that wish to clear trades executed through the facilities of the SEF or DCM.

The technological issue then is minimizing the time between trade execution and acceptance into clearing. This time lag potentially presents credit risk to the swap counterparties, clearing members, and the DCO because the value of a position may change significantly between the time of execution and the time of novation, thereby allowing financial exposure to accumulate in the absence of daily mark-to-market.
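A back-of-the-envelope sketch in Python (all figures invented for illustration) shows why the lag matters: the uncollateralized exposure is roughly the mark-to-market move that accrues between execution and novation, before daily margining begins.

```python
# Back-of-the-envelope: exposure that builds up between trade execution and
# acceptance into clearing (novation). All numbers are invented for illustration.
notional = 100_000_000        # $100mm interest-rate swap
dv01 = 0.0001 * notional * 7  # rough dollar value of a 1bp move on a ~7yr swap
rate_move_bp = 25             # adverse 25bp move before the trade is novated

exposure = dv01 * rate_move_bp
print(f"Exposure accumulated during the lag: ${exposure:,.0f}")
# With near-real-time acceptance, the move -- and hence the uncollateralized
# exposure -- is correspondingly smaller before margining begins.
```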

Thus, what is not often discussed in the political furor over Dodd-Frank is how this legislation is driving industry participants toward “prompt, efficient, and accurate processing of trades” while simultaneously encouraging competition. An initiative to improve and better integrate front-to-back-office processing on such a large scale has not been seen by the industry since the “paper crunch” of the 1970s and the passage of the Securities Acts Amendments of 1975. In our opinion, it’s about time.

Footnotes:

[1] Edward Wyatt, Dodd-Frank Act a Favorite Target for Republicans Laying Blame, New York Times, September 20, 2011 

[2] Michael Greenberger, Will Wall Street prevail? The Baltimore Sun, October 8, 2012. 

[3] The Alternative Investment Management Association Ltd; Javelin Capital Markets, LLC; Societe Generale; Asset Management Group of the Securities Industry and Financial Markets Association (SIFMA); Spring Trading, Inc.; Vanguard. 

[4] Bank of America; Merrill Lynch; BNP Paribas; Citi; Credit Suisse Securities (USA) LLC; Goldman Sachs; HSBC; J.P. Morgan; Deutsche Bank; Edison Electric Institute; ISDA; Morgan Stanley; Societe Generale; UBS Securities LLC.

Friday, October 19, 2012

Thoughts on ISDA SIFMA v. U.S. CFTC

In reading Judge Wilkins' Memorandum Opinion regarding Civil Action No. 11-cv-2146 (RLW), in which ISDA and SIFMA challenged the CFTC on position limits on derivatives tied to 28 physical commodities, we are reminded of an insightful line from a paper on behavioral finance:
"What has happened is that we've used these assumptions for so long that we've forgotten that we've merely made assumptions, and we've come to believe that the world is necessarily this way." ISDA SIFMA v US CFTC - Civil Action 11-Cv-2146 - Memorandum Opinion

The underlying flaw in the law is that "excessive speculation" is an eye-of-the-beholder standard, not black-letter law. The same can be said of the phrase "undue and unnecessary burden on interstate commerce". Thus, in a semantic tour de force, the ruling reveals two insights into the rulemaking process.

First is the reliance on economic thought, and how philosophical disagreements amongst economists within the so-called "dismal science" are the "gift that keeps on giving" with respect to obfuscating rulemaking intent. In an "appeal to authority", Judge Wilkins references various CFTC Commissioners' prior statements forecasting the Plaintiffs' argument as to the need for "statutorily-required findings of necessity prior to promulgating the Position Limits Rule". Given that economists will dispute ad infinitum whether position limits can actually constrain "excessive speculation", the seeds of the Rule's destruction were sown from the start.

Second, given that the law states that limits for exempt commodities are required to be established within 180 days after July 21, 2010, and that limits for agricultural commodities are required to be established within 270 days after July 21, 2010, the CFTC was faced with a conundrum. How could it both comply with the law's deadline requirement and serve the needs of its constituency, which generally stands against the imposition of position limits [see comment letters]?

The answer seems to have been the CFTC's decision to avoid first performing "any reliable economic analysis," as suggested by Commissioner Dunn, lending support to the Plaintiffs' claims:
  1. Violation of the CEA and APA--Failure to Determine the Rule to be Necessary and Appropriate under 7 U.S.C. § 6a(a)(1), (a)(2)(A), (a)(5)(A)
  2. Violation of the CEA--Insufficient Evaluation of Costs and Benefits under 7 U.S.C. § 19(a) 
  3. Violation of the APA--Arbitrary and Capricious Agency Action in Promulgating the Position Limits Rule 
  4. Violation of the APA--Arbitrary and Capricious Agency Action in Establishing Specific Position Limits and Adopting Related Requirements and Restrictions 
  5. Violation of the APA--Failure to Provide Interested Persons A Sufficient Opportunity to Meaningfully Participate in the Rulemaking 
The above arguments are not a one-off situation. In many of the Final Rules promulgated by the CFTC, the agency has not engaged in cost-benefit analysis. If President Obama wins re-election, one can assume that in the years ahead we'll see more of these kinds of cases brought before the courts, with this ruling providing precedent. Should that scenario prevail, we may not see full implementation of Title VII for many years, if ever.

The irony for the industry is the ongoing uncertainty surrounding the applicability of the CFTC's Final Rules involving Title VII. In other words, what may suffer most from an industry strategy to undermine Dodd-Frank through the courts is SIFMA's own claimed mission, that of "building trust and confidence in the financial markets".

Regardless, we think that this ruling's potential impact on all other Final Rules issued by the CFTC is significant. It sets precedent and legitimizes an avenue of attack against the CFTC's reliance on a 1981 rulemaking under which it assumed that cost-benefit analysis could be avoided.