### Jonathan Fine

[completed 2008-12-14]

Jonathan Fine is a long-time participant in the TeX community. He is currently chair of the UK TeX Users Group and is employed supporting TeX at the Open University.

Jonathan Fine, interviewee:     At school I was good at mathematics, which pleased my father, and I studied mathematics and philosophy at university. I then wrote a PhD in pure mathematics. I wrote it by hand, and the technical typists then typed it up, using the IBM golf-ball typewriter. Special symbols were obtained by changing the golf-ball print-head. This was the standard way of doing things then. In the prior generation, maths PhDs were typed in the usual way, with space into which the formulas were handwritten.

DW:     I'm curious — what was the title of your thesis?

JF:     The title is not important. I proved some results on the resolution and completion of algebraic varieties. In other words, it was in algebraic geometry. Since then I've done some work using the connection between toric algebraic varieties and convex polytopes. I've also done some work on the Vassiliev-Kontsevich knot invariants. But I've not published much, and so didn't have a research career.

I've taught mathematics in some colleges in the USA, spent time doing research (but with few results or publications to date), spent time consulting, and worked writing software in both commerce and publishing.

Since 2003 I've been working as a TeX expert at The Open University, which is the UK's leading provider of distance learning. My main work there is to maintain and develop the TeX system that is used for producing course materials for the maths courses and upper-level physics courses. Lately, partly with external funding, I've been doing a lot with mathematics on web pages, and TeX as a web service.

I enjoy spending time with family, which often means traveling. For example, this summer I went to Greece, for the first time, for my nephew Joel's wedding. His mother-in-law is Greek, and he's making progress with the language. He and his wife live in Brussels.

Apart from the wedding and meeting my family, one of the high points (and in a sense literally) was taking the train from Athens to Thessaloniki. This journey went through the mountains in the north, and at one point Mount Olympus was on the left and the sea on the right. I also got a great deal out of visiting one of the roots of our culture. There seemed to be a real awakening of the human spirit and society at that time.

Over the past 10 years or so I've benefited greatly from the teachings of Vietnamese Zen master Thich Nhat Hanh and the monks and nuns at the monastery Plum Village that he established in France. I find the practice rewarding, but also at times difficult, particularly when caught up in the stresses and tensions of software development and support. I do wish I could communicate better.

DW:     When and how did you first get involved with TeX?

JF:     In 1985 I was working in Boston in the USA …

DW:     I've lived in Boston for 45 years; I hope you liked our city.

JF:     One of my favourite places in Boston was the Museum of Fine Arts. I also liked the Arnold Arboretum in Jamaica Plain. And there were many interesting places to eat. But this is all a long time ago now, and I've not been back for many years.

DW:     Excuse my interruption. You were saying …

JF:     There was an interesting event at the Computer Museum in Boston. It was Donald Knuth giving a talk for a “coming out” party for TeX and the publication of his series of books Computers and Typesetting, the first volume of which was The TeXbook. It was rather a miserable winter evening, cold and raining, and much to my regret I decided to stay at home. So my first encounter was a near miss.

The following year I was writing a paper, with help from an expert, using the Unix troff system, and a colleague gave a talk on the new mathematics typesetting system that the American Mathematical Society was adopting. He showed us how he encoded the University letterhead (which used only text) using this new system, called TeX. I was impressed, but continued to use troff to finish the paper.

The year after that, 1987, I was at Indiana University, and their mathematics journal was produced using TeX. I got to know the technical editor for this journal, and she introduced me to TeX. Indiana University also had a scheme where staff could buy TeX for PCs at a discount. So I got a TeX distribution on many 360k disks, to run on the new PC I bought especially to run TeX. As I recall, mostly it was ArborText software.

The PC, by the way, was a 10MHz 80286 with a 40MB hard disk and 1MB of memory. Pretty soon, I bought another 1MB of memory and the excellent DesqView multitasking platform for running MS-DOS programs. I also bought a 300 dpi HP LaserJet II. At that time, TeX was a demanding program, and I needed both the storage (mostly for bitmap fonts) and the processing power.

DW:     But now you have a job as a TeX expert. Did you actually use TeX in ever more sophisticated ways over the years, or did you just get hooked on TeX or typesetting as a sort of complex hobby as some TeX experts seem to have done?

JF:     I did get hooked by TeX, because of the excellence of its output, the absence of bugs, and its unique ability to typeset mathematics. And like many others, I spent many hours learning the TeX macro language.

For some time I did try more sophisticated ways of making TeX do things, by writing complicated macro packages such as Active TeX. I still have to maintain and write complex macros as part of my work, but I'd like to move over more to something like the PostScript model.

I'll explain what I mean. Drawing programs that generate PostScript output rely on a static header file that wraps the PostScript primitives into something a bit more usable, and a program that manages data structures and fonts and the like, and which emits intermediate level PostScript (as defined by the header). The admirable dvips PostScript driver for TeX's DVI files does exactly this. One could write dvips almost entirely in PostScript if one wanted to, but that would not be sensible.

No-one writes a drawing program in PostScript, although it could be done. We know that PostScript is not the right language for that sort of task. But TeX experts try to write complex output routines using TeX macros, and then argue that because it's so hard to do this, we should develop and use a TeX extension instead.

I think the best way to use TeX (or some extension) is as a module that extends some well-established scripting language. There are two major challenges here. First, interfacing the high-level language with the typesetting engine, and second, running the typesetting engine as a daemon.

By the way, in one of his interviews Don Knuth said that it was his intention with TeX to write “just a typesetting language”, and that many macro language features were added only as a result of pressure from users.

I've solved the second problem already, as part of the MathTran project, which makes TeX available as a high-performance web service.

DW:     After I joined TUG, my first memory of hearing of you had to do with Active TeX. Please tell me about the motivation and history of that.

JF:     There are two sides to Active TeX, namely a programming language and a means of user input. At that time, like everyone else in the TeX community, I thought that the way to make TeX do new things was to write TeX macros. The idea of using a modern scripting language, such as Perl, Python, Ruby or Lua, was not widespread. In fact, Perl and Python only reached maturity in 1994 and 2000 or so respectively.

TeX is a wonderful typesetting program, but its macro language is not suitable for writing complicated programs. Active TeX makes TeX easier to program, but few people were interested in using it, and over time I lost interest myself.

Here's an example of Active TeX. Most languages have named parameters to functions. Well TeX doesn't have functions, but it does have macros. But its macros don't have named parameters. With Active TeX you could write

def mymacro #width #text {
hbox "to" width { text }
}

and the effect would be the same as
\def\mymacro #1#2{\hbox to#1{#2}}%

which is not nearly so easy to write or to read.

On the input side, TeX macro packages from Knuth onwards (and this includes both LaTeX and ConTeXt) use changes of what are known as category codes to allow verbatim (or unescaped) input. As a result, you can't use \verb in LaTeX arguments. For example

\section{The \verb|\iffalse| command}

will bomb out, even though \verb works in ordinary paragraph text.

Today, many people use wiki languages. One could write a wiki-language parser using Active TeX, but I wouldn't recommend that now. I'd use a modern scripting language (Python is my favourite) to translate the input into fairly low-level TeX commands.
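A minimal sketch of that translation step, in Python. The markup rules and the function name here are invented for illustration, not taken from any real wiki language or from MathTran:

```python
import re

def wiki_to_tex(text):
    r"""Translate a tiny invented wiki markup into low-level TeX commands.

    Characters special to plain TeX are escaped first (a minimal subset
    only), then *bold* becomes {\bf bold}.
    """
    # Escape some of plain TeX's special characters; \$, \&, \# and \%
    # are the plain TeX macros for the literal characters.
    for ch in ('$', '&', '#', '%'):
        text = text.replace(ch, '\\' + ch)
    # *bold* -> {\bf bold}
    return re.sub(r'\*([^*]+)\*', r'{\\bf \1}', text)
```

The point of the design is that all the parsing happens in the scripting language; TeX receives only straightforward commands and never needs category-code tricks on its side.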

In case you want to know how it works, Active TeX makes all characters active, and it gives you tools for giving each character the meaning you want.

DW:     Please tell me about your involvement with TeX user groups.

JF:     I've been around for a long time. I joined both TUG and UK-TUG in 1990 or so. I've served on the Committee of UK-TUG for several years, and in 2006 I was elected chair. I've just been re-elected, to serve to 2010. At our last AGM we adopted a new constitution, which will help us a lot.

In 1995 I was the main organiser of a very successful UK-TUG meeting in London on “TeX, SGML and PDF”. (This was before the invention of XML.) It was a sell-out, with about 120 delegates. Unfortunately, the UK-TUG committee did not build on the momentum created.

Today, UK-TUG is rather weaker than it was then. We're managing to keep things ticking over, and to organise a meeting once a year. I'd like to have more time for this, but my main focus is on software development.

I would like, next year, to provide some new online resources for TeX documentation and training.

DW:     You mentioned your MathTran project earlier. I have seen the video of your presentation of your MathTran project from the last TUG annual conference. Will you please say a few more words about it, its purpose, and its implementation.

JF:     MathTran makes TeX available as a high-performance web service. You send it a URL, with TeX-encoded mathematics in the query string, and you get back the typeset formula as a bitmap graphic. This allows you to include typeset mathematics in your web page without having to install TeX.

Google Charts provides a similar web service, except it returns a pie chart or a graph or whatever.
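As a sketch of what such a request looks like, a client only needs to build a query string. The host name and parameter names below are invented for this illustration; they are not the documented MathTran interface:

```python
from urllib.parse import urlencode

# Hypothetical base URL -- a stand-in, not the real service address.
BASE = 'http://mathtran.example.org/render'

def formula_url(tex, size=2):
    """Build the request URL for a MathTran-style web service.

    The parameter names ('tex' for the formula, 'D' for display size)
    are assumptions for this sketch.
    """
    return BASE + '?' + urlencode({'D': size, 'tex': tex})
```

Fetching that URL (with any HTTP client) would return the rendered formula as a bitmap, so a web page can embed mathematics with an ordinary img tag.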

It takes TeX perhaps 0.25 seconds to initialize, and next to no time at all to typeset a small formula. MathTran runs TeX as a daemon, to remove the start-up time from the loop. It does something similar for dvipng. As a result, typesetting and rendering to a bitmap takes about 10 milliseconds.
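The daemon idea can be sketched in Python, MathTran's implementation language. This is not MathTran code: the worker below merely upper-cases its input, standing in for a persistent TeX process whose start-up cost is paid once:

```python
import subprocess
import sys

# Start a long-running worker once; the (pretend) expensive start-up
# cost is paid here, not on every request.  The worker is a stand-in
# for a TeX daemon: it just upper-cases each line it is sent.
worker = subprocess.Popen(
    [sys.executable, '-u', '-c',
     'import sys\n'
     'while True:\n'
     '    line = sys.stdin.readline()\n'
     '    if not line:\n'
     '        break\n'
     '    sys.stdout.write(line.upper())\n'
     '    sys.stdout.flush()'],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)

def typeset(formula):
    """Send one request to the warm worker and read back one reply."""
    worker.stdin.write(formula + '\n')
    worker.stdin.flush()
    return worker.stdout.readline().rstrip('\n')
```

With real TeX the same pattern applies: the daemon loads the format once, and each subsequent request costs only a pipe round trip rather than a fresh start-up.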

But before we can safely run TeX as a daemon, we have to secure it against unwelcome input. For example, a user might send TeX the string \gdef\alpha{\beta} in an attempt to cause confusion later, possibly for other users. MathTran uses a variant of Knuth's plain TeX format, called secure plain, in which commands such as \gdef are not accessible to the user. In fact, in secure plain, \gdef is an undefined command, and the primitive command \gdef is instead stored as \_gdef. Because of category codes, the ordinary user cannot access \_gdef.

Put more simply, with secure plain the user can access only appropriate commands, and the others are inaccessible.

MathTran is an open source project, coded in Python and hosted on SourceForge. JISC (a UK higher education funding body) and the Open University provided funding.

DW:     Please tell me a bit more about the role of TeX and your role in supporting it at the Open University (and while you're at it, what sort of university the Open University is).

JF:     As I mentioned already, the Open University is the UK's leading provider of distance learning. It was founded in 1969, and gave many people an opportunity they would not otherwise have had to take up higher education. Many of our students are people who for personal or social reasons were not able to go to college after school. When it started, much of the course material was broadcast in the early hours by the BBC, and so the OU became known as the university of the air.

Nowadays, the Internet is widely used to distribute OU learning materials. I'm the TeX expert in LTS (Learning and Teaching Solutions), which is the publishing division of the OU. Initially my work was entirely with print, but now I'm very much involved with web pages and online activities.

Getting mathematical content to work well on web pages is one of our major challenges, and allowing students to author mathematics in forum posts and so forth is another. I'm in the middle of a two-year research project in this area, which builds on MathTran.

DW:     Finally, what challenges and opportunities do you see for TeX and its communities of users and developers?

JF:     Everyone has their own answer to this, based on what's important to them, and on their own experience and problems. For example, getting math on the web is a major problem for me, whereas for others Unicode text and internationalization are more important.

It's very easy to see one's own problems as being more important than those of others, and I'm very concerned that this focus can cause division and conflict in our community, and hold us back. Here's an example. Generating PDF is important to many users, but so are applications based on the original DVI format. I'm in favor of generating PDF and I'm also in favor of generating DVI. But it's so easy for this difference to become a cause of conflict.

The core of TeX, namely TeX the program, Metafont and the Computer Modern fonts, was authored by one very skilled, highly motivated and well supported developer, namely Don Knuth. I don't think we've seen any similar contribution to TeX, either by an individual or a team, since then.

We have seen many valuable incremental contributions, such as the emergence of LaTeX2e, PostScript and PDF support, cross-platform builds for TeX, improved packaging and distribution, the CTAN archive, ConTeXt, international fonts and hyphenation (and apologies for the many omissions in this list).

But TeX was completed in 1992, and frozen by Don Knuth in 1999. Much has happened since then. Computers have become more powerful, the Internet is much more pervasive, Unicode has emerged, and there are powerful and widely-adopted scripting languages (such as Perl, Python and Ruby). In addition, XML has become a standard and there are many lightweight (or wiki) markup languages.

I think our biggest challenge is moving forward, as a growing community of both users and developers, with an extension of TeX (which must of course be given some other name) that responds sensibly to these new challenges and opportunities.

It will be really important, of course, to avoid dividing or otherwise damaging the community, and to avoid what is known as creeping featuritis. Scholarly users of TeX really admire its stability across both time and space.

My main emphasis, which I think comes out in the interview, is:

1. Not replacing TeX the program unless we have to
2. Running TeX as a daemon
3. Running TeX as a web service
4. Improving the programming interface to TeX

I was at the 2008 TUG Conference, and wished that the level of technical communication and debate had been higher. I felt, for example, that LaTeX3 and LuaTeX/ConTeXt were ignoring each other. Both projects share problems and a platform, but I'm not aware of significant ongoing discussion or sharing of code involving both these projects. Similarly, XeTeX uses an extended DVI format (called xdv), while so far as I know LuaTeX ignores it and has nothing similar.

It's easy years on to look back on the past as a golden age, but I feel that 25 or so years ago there was an excitement and confidence and energy in the TeX community, similar for example to the more recent emergence of GNU/Linux as a free operating system. I don't see enough of this energy present today, and I think this lack is one of our major challenges.

The hope for the future, as always, must be in the skills and abilities and motivation of the younger generation. Older members of the community, such as myself, have a responsibility to share our experience with them.

The sort of thing I'd like to see happen soon is for TeX, or if necessary a minor extension of TeX, to be incorporated into platforms such as Firefox as a mathematics rendering engine. This could be done in the next two years, given fairly modest resources and sufficient enthusiasm. Such a development would make an enormous change to the use and perception of TeX and related software.

DW:     Thank you very much for taking the time to participate in this interview and for sharing your story and views with me. I can keep up with your activities by reading your blog.

Interview pages regenerated January 26, 2017;