    • CommentRowNumber1.
    • CommentAuthorUrs
    • CommentTimeOct 13th 2009

    For the first time I am seriously looking into the issue of taking an nLab entry and re-using it in a LaTeX document.

    My first observation is that the automatic tex-output feature is pretty useless for that. Or maybe I am missing something. In any case, the code it produces for the pages I have tried always crashes my LaTeX compiler.

    Even without trying to compile the code directly, I see that the engine recognizes hyperlinks of the form

      [[something|something else]]

    as something that needs to be converted to "something else".

    When converting the source by hand, everything is pretty straightforward except that it is precisely removing the square brackets and the links from hyperlinks that is tedious.

    Also, the automatic tex-output feature creates a huge preamble trying to emulate all kinds of things, where I could do with pretty much literally the source itself. All I really need is:

    • an automatic way to turn the headline hierarchy into a LaTeX section hierarchy

    • keeping only the displayed text of each hyperlink

    • maybe putting a \mathrm{...} around consecutive symbols in math environments.
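
    For instance (an invented snippet, just to illustrate the last two points), source such as

      ... the [[homotopy category]] of $Top$ ...

    would ideally come out as

      ... the homotopy category of $\mathrm{Top}$ ...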

    Does anyone feel similar needs? Or am I just not using the available tex-output feature correctly?

    • CommentRowNumber2.
    • CommentAuthorUrs
    • CommentTimeOct 13th 2009
    I see that the engine recognizes

    Sorry: I see that the engine DOES NOT recognize.
    • CommentRowNumber3.
    • CommentAuthorAndrew Stacey
    • CommentTimeOct 13th 2009

    I use it for making a PDF copy of homeworks on my course installation of Instiki. I find it works all right, but then the documents are self-contained. I do have to go through and edit a few things, though; wikilinks are one of them. I mentioned this to Jacques once and, as well as admitting that the tex output was not optimal, he asked what wikilinks should produce. If we can come up with a satisfactory answer to that, then it shouldn't be too hard to implement. Personally, I would go for a new command \wikilink[]{} which could then be redefined in the preamble to whatever was wanted.
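
    A minimal sketch of what that could look like (the footnote variant is just one hypothetical redefinition):

      % default: typeset only the displayed text, ignoring the page name
      \newcommand{\wikilink}[2][]{#2}
      % e.g. redefine in the preamble to record the target page as well:
      % \renewcommand{\wikilink}[2][]{#2\footnote{nLab page: #1}}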

    There is a little more to the tex output than you say. There's all the other markdown syntax that gets converted (italics, bold-face, etc.), so I'd be reluctant to do away with it and would rather improve it.

    For the \mathrm{...}, try using the new \operatorname{...} command in the text. It is annoying that, say, \Poly produces "Poly" in itex but an error in latex, but the \operatorname{...} command means that it gets it right in both cases.

    As for removing the preamble, just do

    cat doc.tex | perl -lne '/\\end{document}/ and $c = 0; $c and print; /\\begin{document}/ and $c = 1'
    

    and that will output everything between (but not including) the \begin{document} and \end{document} commands.

    • CommentRowNumber4.
    • CommentAuthorUrs
    • CommentTimeOct 13th 2009

    I want the wikilinks to become plain text.

    In the vast majority of cases, at least here on the Lab, a wikilink is additional, optional information attached to a keyword, not an integral part of the text. Except at places where we say "see also this link". But in those cases one has to intervene by hand anyway when going over to LaTeX.

    To be concrete, I am currently producing by hand a LaTeX equivalent of schreiber: theory of differential nonabelian cohomology. The automatic tex-output for that doesn't seem to help much.

    What I am spending 90 percent of my time on is really just turning all the hyperlinks into plain text. The rest is converting the syntax for boldface and italics. But that's about it.

    • CommentRowNumber5.
    • CommentAuthorMike Shulman
    • CommentTimeOct 13th 2009

    Surely a regexp search-and-replace could remove the wikilinks quickly? But I agree that the TeX output should turn them into something like \wikilink that could easily be redefined.
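
    Something like this one-liner, say (a sketch only; it assumes wikilinks look like [[page]] or [[page|text]] and do not nest):

      perl -pe 's/\[\[(?:[^\]|]+\|)?([^\]]+)\]\]/$1/g' doc.tex

    This keeps just the displayed text of each wikilink.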

    I think it would also make sense for the TeX output to add something like \newcommand{\Poly}{\operatorname{Poly}} to the preamble for every command like \Poly that's used in the text but isn't defined in the usual packages. I'm surprised that the existing TeX output doesn't at least put \mathrm or \operatorname around consecutive symbols, given the notable difference between how they are treated by itex and by tex.

    • CommentRowNumber6.
    • CommentAuthorAndrew Stacey
    • CommentTimeOct 13th 2009

    To save you some grief (in case your need is more urgent than our usually leisurely discussions allow), what you could do is get the source of the page and then convert it to tex using maruku locally. That would enable you to add a first step that deals with wikilinks and the like before maruku gets hold of it and converts the rest to LaTeX. It won't be hard to come up with a script that removes all wikilinks and converts '\dontknow' to '\operatorname{dontknow}'. The first of these is certainly easier to do before the conversion than afterwards.
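
    A sketch of the second conversion (the command names listed are only examples; a real script would enumerate the itex-only commands that plain LaTeX lacks):

      # wrap the listed command names in \operatorname
      perl -pe 's/\\(dontknow|Poly|Grpd)\b/\\operatorname{$1}/g' page.md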

    Incidentally, the conversion to tex is done by maruku. I'm not sure whether or not iTeX even gets called when it does the conversion so what happens inside mathematics environments may be tricky to work with.

    • CommentRowNumber7.
    • CommentAuthorUrs
    • CommentTimeOct 13th 2009

    It's semi-urgent, not for me but for one of my collaborators, but the amount of conversion we need to do can be done by hand.

    But I just tried to compile the tex code produced for random pages on the nLab, including some with comparatively trivial content (layout-wise). My LaTeX compiler hangs on each and every one of them and needs to be killed by hand.

    None of you have that problem?

    • CommentRowNumber8.
    • CommentAuthorUrs
    • CommentTimeOct 13th 2009

    I checked with Hisham Sati and asked him to try to compile the TeX output of a random page from the Lab on his machine. He says it doesn't compile at all for him either, producing just a huge list of error messages.

    I think he probably tried this one on my personal web. I'd be interested in hearing if anyone else has any luck in compiling that one.

    • CommentRowNumber9.
    • CommentAuthorMike Shulman
    • CommentTimeOct 13th 2009

    The TeX of that page compiles for me with the following changes:

    1. Comment out the \renewcommand{\empty}{\emptyset}

    2. Change all the \array{...} commands to \begin{array}{ccc}...\end{array} where there are enough "c"s to handle as many columns as that particular array environment uses.
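
    For example (diagram content invented), an iTeX-style

      \array{ A & \to & B \\ \downarrow & & \downarrow \\ C & \to & D }

    becomes

      \begin{array}{ccc}
        A & \to & B \\
        \downarrow & & \downarrow \\
        C & \to & D
      \end{array}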

    • CommentRowNumber10.
    • CommentAuthorAndrew Stacey
    • CommentTimeOct 13th 2009

    If you look in the preamble then you'll see that \array is one of the unresolved issues with the tex output.

    • CommentRowNumber11.
    • CommentAuthorUrs
    • CommentTimeOct 13th 2009

    I see, thanks.

    How hard would it be to make the software take care of these two issues?

    • CommentRowNumber12.
    • CommentAuthorAndrew Stacey
    • CommentTimeOct 13th 2009

    I guess the first thing is to figure out what it should be. I just tried it with:

    % note: the matrix environment requires amsmath
    \newcommand{\itexarray}[1]{\begin{matrix}#1\end{matrix}}
    

    and a search-and-replace on \array{} to make it \itexarray{} (\renewcommand-ing \array{} doesn't work). You should look at the result to see whether it's acceptable. There may be a better choice.
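
    The replacement itself is easy to script, e.g. (since only the command name changes, the braces stay balanced):

      perl -pe 's/\\array\{/\\itexarray{/g' doc.tex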

    I've no idea how easy this would be to implement. That's not my department.

    I don't get any errors for \empty, by the way. What's the issue there?

    • CommentRowNumber13.
    • CommentAuthorUrs
    • CommentTimeOct 13th 2009

    Oh, I see. Then I'll think about contacting Jacques about this. It would be a pity if all pages with an array on them failed to TeXify just because of that.

    • CommentRowNumber14.
    • CommentAuthorMike Shulman
    • CommentTimeOct 13th 2009

    I don't know what the issue is with \empty. The error it causes is "Missing $ inserted" at the location of \begin{document}, and lots of other similar ones later on. I assumed that LaTeX uses a command called \empty internally for another purpose, although I didn't track it down. This conjecture is supported by the fact that \empty is \renewcommand'd rather than \newcommand'd.

    • CommentRowNumber15.
    • CommentAuthorTobyBartels
    • CommentTimeOct 13th 2009

    I assumed that LaTeX uses a command called \empty internally for another purpose

    It does; I believe it is defined internally as an empty macro. (I find it annoying, for that reason, that iTeX uses \empty; I use \nothing in my own documents, with the purpose of defining it to be \emptyset or \varnothing as I choose.)
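
    Spelled out, that convention is just something like:

      \newcommand{\nothing}{\varnothing}   % or: \newcommand{\nothing}{\emptyset}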

    • CommentRowNumber16.
    • CommentAuthorUrs
    • CommentTimeOct 3rd 2010

    I am trying again, and am again having trouble turning nLab pages into LaTeX code that compiles on my machine. I’d like to ask you for help or comments.

    Some of the problems that I had when I first started this thread were solved.

    For instance I realized that

    • where I thought my LaTeX compiler crashed when compiling an nLab page, it in fact just comes back after a long while.

    • I needed to switch to a modern compiler such as xelatex (thanks to Jacques Distler for this information) that could handle the extra unicode characters (such as the omnipresent infinity-signs).

    But I am still experiencing problems. Could somebody do me a favor and check if you can compile the LaTeX code that is produced for instance for the entry

    Thanks!

    When I try to compile the code that the nLab produces after hitting the button “TeX” on that entry, I get the error message

      Use of \@array doesn't match its definition
    

    first on line 408, and then a huge list of further error messages after that.

    • CommentRowNumber17.
    • CommentAuthorUrs
    • CommentTimeOct 4th 2010

    Finally it penetrated my thick skull that the situation is as follows:

    for some reason the

      \array{  ...  }
    

    environment that we have been using pretty much exclusively is not as well supported by the auxiliary routines as other environments that achieve the same functionality (as long as no extra options are used).

    Apparently instead we should be using

       \begin{matrix} ... \end{matrix}
    

    That environment is supposed to be supported by the Wiki-to-LaTeX functionality.

    • CommentRowNumber18.
    • CommentAuthorUrs
    • CommentTimeOct 4th 2010
    • (edited Oct 4th 2010)

    I tried my luck on double category:

    I changed the original array-environment there into a matrix environment. That now works.

    But my xelatex compiler still complains about a “runaway argument” in line 433. I don’t see yet what causes this.

    Does anyone?

    • CommentRowNumber19.
    • CommentAuthorEric
    • CommentTimeOct 4th 2010
    • (edited Oct 4th 2010)

    If that is the case, we should probably make a prominent note somewhere strongly suggesting that everyone use the matrix environment.

    How about “aligned”? That is what I often use.
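
    For instance (an invented snippet; aligned needs amsmath and math mode, and it happily takes several ampersands per line):

      $\begin{aligned}
        A &\to B &\quad C &\to D \\
        E &\to F &\quad G &\to H
      \end{aligned}$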

    • CommentRowNumber20.
    • CommentAuthorUrs
    • CommentTimeOct 4th 2010

    If that is the case, we should probably make a prominent note somewhere strongly suggesting that everyone use the matrix environment.

    Yes. And we need to find a way to remove all the existing array-environments.

    How about “aligned”? That is what I often use.

    But does this accept multiple aligned rows? I always thought it accepts just one ampersand delimiter per line. No?

    Could somebody tell me if the entry double category produces compilable LaTeX on your machine? It does not on mine. Jacques says it does on his. I would like to understand what the system requirements are.

    • CommentRowNumber21.
    • CommentAuthorEric
    • CommentTimeOct 4th 2010
    • (edited Oct 4th 2010)

    To be honest, I’m not sure what the difference between the matrix and aligned environments is. For everything I’ve ever done on the nLab, including multiple ampersands per line, aligned has been sufficient, though.

    My LaTeXing days are long gone so I can’t help you with compiling issues without installing LaTeX, etc.

    Out of curiosity, which LaTeX are you using? Editors, etc? I may try to reproduce what you have. (Edit: I suppose operating system could be relevant too.)

    • CommentRowNumber22.
    • CommentAuthorUrs
    • CommentTimeOct 4th 2010

    which LaTeX are you using? Editors, etc?

    Just yesterday I downloaded the latest MiKTeX distribution (in order to get the new XeLaTeX, which is able to handle unicode symbols such as the infinity-signs on our pages).

    I am using this on WinNT.

    • CommentRowNumber23.
    • CommentAuthorAndrew Stacey
    • CommentTimeOct 4th 2010

    I’m getting different behaviour depending on whether I compile it via the command line or via Emacs. It works via Emacs, but on the command line I get complaints about unicode characters. So I’m guessing that Emacs correctly sets some encoding before it calls xelatex, which isn’t set in my usual command line. So it all comes down to encoding.

    • CommentRowNumber24.
    • CommentAuthorMike Shulman
    • CommentTimeOct 4th 2010

    My latex and pdflatex both handle it fine, via Emacs and also on the command line. This is the default TeX installation on Ubuntu Lucid.

    • CommentRowNumber25.
    • CommentAuthorFinnLawler
    • CommentTimeOct 4th 2010

    My pdflatex (TeXLive on Debian Etch) works fine too, from Emacs and the shell.

    • CommentRowNumber26.
    • CommentAuthorTobyBartels
    • CommentTimeOct 4th 2010

    I had no trouble compiling; my system should be the same as Mike’s (except that I need to upgrade my operating system).

    I notice that the compiled LaTeX doesn’t handle [[!includes]] and [[!redirects]] properly.

    • CommentRowNumber27.
    • CommentAuthorUrs
    • CommentTimeOct 4th 2010

    Thanks a lot to you all for going through the trouble of checking this for me! I really appreciate it.

    So I will fiddle a bit more with my LaTeX installation then. Thanks again.

    • CommentRowNumber28.
    • CommentAuthorzskoda
    • CommentTimeOct 5th 2010
    • (edited Oct 5th 2010)

    Amazing. Command-line pdflatex on my Ubuntu works for the TeX output of some entries. Even some letters with diacritics like Č, č, ć, ž seem to work, though đ does not. After downloading, I can modify the content with emacs launched from the command line without being asked to manually set encodings.

    This whole thing with LaTeX being almost exportable now makes codecogs more attractive than SVG for diagrams, since then we have source code, which is easy to modify, carry around, and transfer into LaTeX. Moreover, the codecogs site gives away a free downloadable standalone application which does the main thing, so we may think about running it from some server of our own in the future. I am afraid of depending on a third-party site which may stop working, may start charging in the future, or may limit the number of displays (currently they allow 5,000 formula displays per day per site); the nLab is becoming a big site, and if we start using codecogs regularly that may be a problem.

    EDIT: BTW, I never solved the problem of XHTML vs. HTML in Firefox. I mean that my own copy of nLab HTML, obtained using the wget command (the script written by Andrew), renders incorrectly in Firefox unless I rename the file to an .xhtml extension (otherwise Firefox does not try to parse it as XML). But the links point to .html extensions, so after renaming, the files they point to are no longer found. Of course, I have no problem rendering saved pages which were originally downloaded manually using Firefox. What am I missing?

    • CommentRowNumber29.
    • CommentAuthorzskoda
    • CommentTimeOct 5th 2010

    Maybe we should write a separate page with various advice on using the LaTeX export (with a link to this discussion as well). Maybe it exists already, so I am not going to create it until we find its proper place on the meta lab.

    • CommentRowNumber30.
    • CommentAuthorzskoda
    • CommentTimeOct 5th 2010

    On the other hand, command-line XeTeX on Ubuntu does not complain about đ, č, ć, š in text, but all of them disappear from the output. Pdflatex rendered č, ć, š, complained about đ, and after forced compilation did not render đ, but did render č.

    • CommentRowNumber31.
    • CommentAuthorUrs
    • CommentTimeOct 5th 2010
    • (edited Oct 5th 2010)

    I just wasted about 4 hours of the night trying to get this to work on my machine. But it still won’t.

    Zoran, could you be so kind as to show me – send me by email – the PDF that you obtain by doing LaTeX export on the entry Lie infinity-groupoid? I would like to see whether it is worth the trouble for me to get to this point, or whether I should try another strategy.

    Thanks!

    • CommentRowNumber32.
    • CommentAuthorAndrew Stacey
    • CommentTimeOct 5th 2010

    This whole thing with LaTeX being almost exportable now makes codecogs more attractive than SVG for diagrams.

    au contraire, Zoran, SVG is now more attractive than ever before, thanks to the indefatigable Jacques. It is now possible to export the SVGs to PDFs. For more details, see this blog post. And PDFs are like SVGs: scalable. So you can now draw your diagrams merrily with the inbuilt SVG editor, including maths as you will, to create fantastic diagrams - far, far better than those produced by codecogs - safe in the knowledge that they can be converted to PDFs on export.

    • CommentRowNumber33.
    • CommentAuthorAndrew Stacey
    • CommentTimeOct 5th 2010
    • (edited Oct 5th 2010)

    Urs, getting unicode and LaTeX to work together is a bit of a Dark Art, but it is well worth it in general, let alone when working with nLab exports. Unicode solves all the hassles with different encodings and different characters. Unfortunately, TeX was written before unicode came on the scene, so support for unicode has to be built on top. There are two ways: either ordinary latex with the inputenc package, or xelatex. I’m not an expert on this and am not sure that I’m doing it right (as witnessed by the fact that I get different behaviour in what ought to be identical circumstances). I recommend that you ask about this on http://tex.stackexchange.com since I know that there are some experts there.
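
    In preamble terms, the two routes are roughly this (a minimal sketch, with no claim that it handles every nLab page):

      % route 1: ordinary (pdf)latex; declare the file's encoding
      \usepackage[utf8]{inputenc}
      % characters outside the predeclared set need explicit declarations, e.g.
      % \DeclareUnicodeCharacter{221E}{\ensuremath{\infty}}

      % route 2: xelatex reads UTF-8 natively; load a unicode font instead
      % \usepackage{fontspec}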

    • CommentRowNumber34.
    • CommentAuthorUrs
    • CommentTimeOct 5th 2010

    Andrew,

    thanks for the hints. And thanks for all your help!

    But, myself, I can’t afford, time-wise, to follow this up any further. Apart from the lengthy email discussion Jacques and I had, I spent a good bit of last night fiddling with and reinstalling my entire LaTeX system. Then I had trouble getting it even back to the point where it did what it used to do, let alone compiling anything that is produced by the nLab. There is a tower of other things that I need to do; I just can’t spend more time on fixing software. I am under huge time pressure, and the idea was that the automatic LaTeX functionality of the Lab would help me save time. But it is the other way round. And it is not even clear to me that once the pages do at least compile on my system, they will compile to something that I can actually use.

    So thanks for all your efforts, but I will have to put this issue aside for a bit.

    Zoran,

    in case you do feel inclined to follow up on my little request (maybe not necessary anymore), wait until I have changed all occurrences of the \array command at Lie infinity-groupoid to the matrix environment!

    • CommentRowNumber35.
    • CommentAuthorzskoda
    • CommentTimeOct 5th 2010

    Andrew, I use LaTeX because I have human-readable and adaptable source code. With xypic I can still use small dvi files, I can easily change the source code offline, and so on. I can post the source code to the arXiv and will have no problem with compilation. So far I have not seen much of the “fantastic diagrams” in SVG. In xypic the style of arrows and so on is fixed and neat. This is limited, but it is what I like: standard, clear, readable proportions for mathematics graphics. With SVG there are in principle more possibilities and less of a norm; everybody uses their own style, most of which look thick and ugly to my taste, and if a diagram is big I have a harder time finding my way through it. Now that iTeX with markdown can be exported, it is advantageous to me to have codecogs-style xypic, which can be reformatted by hand (I may write a script filter one day) into the usual xypic formula.

    Urs, I will compile and send the pdf (in the afternoon); at the moment I have to run to something else first.

    Does anybody know whether xelatex-compatible files with unicode characters can compile as arXiv submissions? (I may try to email Ginsparg directly once we know what we want and whether it is easily fixable.)

    • CommentRowNumber36.
    • CommentAuthorTobyBartels
    • CommentTimeOct 5th 2010

    I am afraid of depending on a third-party site which may stop working, may start charging in the future, or may limit the number of displays

    For future diagrams, we really depend on them; for current diagrams, we depend on them only because we are lazy.

    We really ought to upload each image file to the nLab and use that copy.

    • CommentRowNumber37.
    • CommentAuthorzskoda
    • CommentTimeOct 5th 2010
    • (edited Oct 5th 2010)

    We really ought to upload each image file to the nLab and use that copy

    If we do that, we will remove the source code from the appropriate page by replacing it with a link to the image file. This is a counterproductive step in my opinion. I value highly the existence and (at least manual) exportability of the source xypic code.

    It would be better just to have some backup image file, just in case and for offline use. I do not see how to organize such a double standard with the present system. Some sort of case distinction (in the sense of programming languages), using the local copy when available and retaining the source code as a fallback, would be closer to an ideal solution.

    • CommentRowNumber38.
    • CommentAuthorAndrew Stacey
    • CommentTimeOct 5th 2010

    In xypic the style of arrows and so on is fixed and neat. This is limited, but it is what I like: standard, clear, readable proportions for mathematics graphics. With SVG there are in principle more possibilities and less of a norm; everybody uses their own style, most of which look thick and ugly to my taste, and if a diagram is big I have a harder time finding my way through it.

    I personally loathe the way that xypic diagrams look (okay, that’s a bit strong) and I don’t use it anymore. I use TikZ! It’s much better - looks better, easier to write, and all that jazz. So saying that xypic means everything looks the same may be right, but only if everyone uses xypic, which they don’t.

    And SVG is ultimately stylable. Don’t like how my diagrams look? Fine, add a bit of CSS over the top and you can make them appear how you like. For example, I’ve been drawing some pictures of knots and links. To distinguish the links, I use colours. But what if I use red and green, and someone who is red-green colourblind looks at them? With inline images, that’s their hard luck. With SVG, it’s a line of CSS to change that green to blue.

    I think you should look at Jacques’ latest paper to see what’s possible with SVG. Try doing that with xypic.

    But I don’t mean to sound unsympathetic. I can understand the desire not to have that huge SVG mess appearing in the source. There is already a way to have SVGs on the web and inline images in the source: use the \begin{svg} .. \end{svg} and \includegraphics commands from iTeX. It may be possible to persuade Jacques to make \xymatrix a no-op in iTeX so that it’s possible to have a web version of the graphic as an SVG and have the xypic code appear in the source. I think that’s a reasonable feature request to make - at worst, he can only say “No”!
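
    Roughly, that idiom looks like the following (the file name is made up, and Jacques would know the exact rules): the web view renders the svg block and ignores the \includegraphics, while the TeX export keeps only the \includegraphics, pointing at a locally saved copy.

      \begin{svg}
      <svg xmlns='http://www.w3.org/2000/svg' width='100' height='50'>
        <!-- diagram drawn with the inbuilt SVG editor goes here -->
      </svg>
      \end{svg}
      \includegraphics[width=3cm]{dinaturality-diagram}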

    • CommentRowNumber39.
    • CommentAuthorzskoda
    • CommentTimeOct 5th 2010
    • (edited Oct 5th 2010)
    > I think you should look at Jacques' latest paper to see what's possible with SVG. Try doing that with xypic.

    Why would I care? The Microsoft philosophy is one of everyday updates and the need for new versions. I like it, on the contrary, when something works for me and I never need to change it again. For formulas in LaTeX, pretty much everything I routinely need is there. When I used an old 286 computer in the mid-1990s, I could compile LaTeX files, and when I hit return it would start compiling right away. With new Ubuntu (which is over 50% microsoftised) I can hit return and sometimes the window fades to grey while some background process starts up, making me, in such moments, compile more slowly on a new 2007 laptop than on a 1990s 286 machine. Why? Because somebody thinks I like having some background process take over for the needs of various features I never use.

    Edit: it is also important to me that when I visit a colleague at another university who has not tuned the fancy software to my needs, all the basic stuff works pretty much the same there, without talking to a system manager.

    > There is already a way to have SVGs on the web and inline images in the source

    I like the eventual possibility of having both the source code and SVG (or other generated or saved end-formats). On the other hand, it is true that if I were to make a page with SVG for my own purposes, I would inline-include another page containing just the non-human-readable SVG code: say, a page “dinaturality” pointing to “big diagram for dinaturality (nonhuman)”.
    • CommentRowNumber40.
    • CommentAuthorTobyBartels
    • CommentTimeOct 5th 2010

    We really ought to upload each image file to the nLab and use that copy

    If we do that, we will remove the source code from the appropriate page

    I put the source code also into the alt text, and we can continue to do that. If we’re not lazy, that is.

    • CommentRowNumber41.
    • CommentAuthorzskoda
    • CommentTimeOct 5th 2010
    Is there a sample page?
    • CommentRowNumber42.
    • CommentAuthorMike Shulman
    • CommentTimeOct 5th 2010

    Having to store separately the source code and a generated SVG/PDF/image file, and manually update the latter instead of having it automatically generated from the source code, seems to me to make the author do too much work and the computer too little.

    I haven’t tried the instiki SVG editor recently (the last time I tried it, it wasn’t really working for me), but in general I get much better results with xypic and tikz than I do with any WYSIWYG editor.

    • CommentRowNumber43.
    • CommentAuthorzskoda
    • CommentTimeMay 27th 2012
    • (edited May 27th 2012)

    Have you guys seen the LaTeX editor and compiler that runs as an application on top of Google Docs?

    http://docs.latexlab.org

    This means, in particular, that one can be on quite a poor machine without LaTeX installed and still have some LaTeX functionality within Google Docs: one can write, compile, and view, and then export a PDF if needed. I do not know how good it is with various advanced things like hyperTeX etc.

    • CommentRowNumber44.

    I tried it: it looks like hyperlinks and links inside the document are not recognized; also, the Google compiler seems to have problems with the preamble of the Instiki-exported source code.

    • CommentRowNumber45.
    • CommentAuthorzskoda
    • CommentTimeMay 28th 2012

    Given that their “latexlab” is still in development, maybe it would be good to locate the problem and warn them. Maybe it is part of a bigger problem which may be interesting to the developers (well, wishful thinking maybe).