Worst planned trip ever.
That is what my England trip for the GNU Tools Cauldron was, but that only seemed to add to the pleasure of meeting friends again. I flew in to Heathrow and started on a rather long train journey to Halifax, with two train changes from Reading. I forgot my phone on the train but the friendly station manager at Halifax helped track it down and got it back to me. That was the first of the many times I forgot stuff in a variety of places during this trip. Like I discovered that I forgot to carry a jacket or an umbrella. Or shorts. Or full length pants for that matter. Like I purchased an umbrella from Sainsbury’s but forgot to carry it out. I guess you get the drift of it.
All that mess aside, the conference itself was wonderful as usual. My main point of interest at the Cauldron this time was to try and make progress on discussions around multi-arch support for ARMv8. I have never talked about this on my blog in the past, so a brief introduction is in order.
What is multi-arch?
Processors evolve over time and introduce features that can be exploited by the C library to do work faster, like using the vector (SIMD) unit to do memory copies and manipulation faster. However, this is at odds with the C library's goal of being able to run on all hardware, including processors that may not have a vector unit or may not have that specific type of vector unit (e.g. have SSE4 but not AVX512 on x86). To solve this problem, we exploit the PLT and dynamic linking.
I thought we were talking about multiarch, what’s a PLT now?
When a program calls a function in a library that it links to dynamically (i.e. only the reference to the library and the function are present in the binary, not the function implementation), it makes the call via an indirect reference (aka a trampoline) within the binary, because it cannot know where the function entry point in another library resides in memory. The trampoline uses a table (called the Procedure Linkage Table, PLT for short) to jump to the final location, which is the entry point of the function.
In the beginning, the entry point is set to a function in the dynamic linker (let's call it the resolver function), which looks for the function name in the libraries that the program links to and then updates the table with the result. The dynamic linker's resolver function can do more than just look for the exact function name in those libraries, and that is where the concept of Indirect Functions, or IFUNCs, comes into the picture.
Further down the rabbit hole - what’s an IFUNC?
When the resolver function finds the function symbol in a library, it looks at the type of the function before simply patching the PLT with its address. If it finds that the function is an IFUNC type (let's call it the IFUNC resolver), it knows that executing that function will give the actual address of the function it should patch into the PLT. This is a very powerful idea because it allows us to have multiple implementations of the same function built into the library for different features, and then have the IFUNC resolver study its execution environment and return the address of the most appropriate function. This is fundamentally how multi-arch is implemented in glibc, where we have multiple implementations of functions like memcpy, each utilizing different features, like AVX, AVX2, SSE4 and so on. The IFUNC resolver for memcpy then queries the CPU to find the features it supports and returns the address of the implementation best suited to the processor.
… and we’re back! Multi-arch for ARMv8
ARMv8 has been making good progress in terms of adoption and it is clear that ARM servers are going to form a significant portion of the datacenters of the future. That said, major vendors of such servers with architecture licenses are trying to differentiate by innovating at the microarchitecture level. This means that a sequence of instructions may not necessarily have the same execution cost on all processors. This gives vendors an opportunity to write optimal code sequences for key function implementations (string functions, for example) for their processors and have them included in the C library. They can then use the IFUNC mechanism to identify their processors and launch the routine best suited to their processor implementation.
This is all great, except that they can’t identify their processors reliably with the current state of the kernel and glibc. The way to identify a vendor processor is to read the MIDR_EL1 and REVIDR_EL1 registers using the MRS instruction. As the register names suggest, they are readable only in exception level 1, i.e. by the kernel, which makes it impossible for glibc to read them directly, unlike on Intel processors where the CPUID instruction is executable in userspace and is sufficient to identify the processor and its features.
… and this is only the beginning of the problem. ARM processors have a very interesting (and hence painful) feature called big.LITTLE, which allows for different processor configurations on a single die. Even if we had a way to read the two registers, we could end up reading MIDR_EL1 from one CPU and REVIDR_EL1 from another, so we need a way to ensure that both values are read from the same core.
This led to the initial proposal for kernel support to expose the information in a sysfs directory structure in addition to a trap into the kernel for the MRS instruction. This meant that for any IFUNC implementation to find out the vendor IDs of the cores on the system, it would have to traverse a whole directory structure, which is not the most optimal thing to do in an IFUNC, even if it happens only once in the lifetime of a process. As a result, we wanted to look for a better alternative.
The number of system calls in a directory traversal would be staggering for, say, a 128 core processor, and things will undoubtedly get worse as we scale. Another way for the kernel to share this (mostly static) information with userspace is via the VDSO, with an opaque structure in userspace pages in the VDSO and helper functions to traverse that structure. This however (or FS traversal for that matter) exposed a deeper problem: the extent of things we can do in an IFUNC.
An IFUNC runs very early in a dynamically linked program and even earlier in a statically linked program. As a result, there is very little that it can do because most of the complex features are not even initialized at that point. What’s more, the things you can do in a dynamic program are different from the things you can do in a static program (pretty much nothing right now in the latter), so that’s an inconsistency that is hard to reconcile. This makes the IFUNC resolvers very limited in their power and applicability, at least in their current state.
What were we talking about again?
The brief introduction turned out to be not so brief after all, but I hope it was clear. All of this fine analysis was done by Szabolcs Nagy from ARM when we first talked about multi-arch, and the conclusion was that we needed to fix and enhance IFUNC support first if we had any hope of doing micro-architecture detection for ARM. However, there is another way for now…
A (not so) famous person (me) once said that glibc tunables are the answer to all problems including world hunger and of course, the ARMv8 multi-arch problem. This was a long term idea I had shared at the Linaro Connect in Bangkok earlier this year, but it looks like it might become a reality sooner. What’s more, it seems like Intel is looking for something like that as well, so I am not alone in making this potentially insane suggestion.
The basic idea here would be to have environment variable(s) to do/override IFUNC selection via tunables until the multi-arch situation is resolved. Tunables initialization is much more lightweight and only really relies on what the kernel provides on the stack and in the auxiliary vector, and on what the CPU provides directly. It seems easier to delay IFUNC resolution at least until tunables are initialized and then look harder at how much further it can be delayed, so that resolvers can use other things like the VDSO and/or files.
So here is yet another idea that has culminated into a “just finish tunables already!” suggestion. The glibc community has agreed on setting the 2.25 release as the deadline to get this support in, so hopefully we will see some real code by then.
The pretext was Nisha’s cousin’s wedding in Bangalore. We were already high from our wonderful wildlife experience in Thailand and when the chance to travel to Bangalore came, we were in no doubt that a safari in one of Karnataka’s national parks will grace one of the weekends. It did not take us long to settle on the national park - we were going to Bandipur! I booked us in Bandipur Safari Lodge for 3 nights, which gave us 6 safaris to look for wildlife. A tiger would be amazing but I was more interested in spotting leopards and sloth bears.
We landed at the Bangalore airport early in the morning and waited for our Zoomcar to arrive. I found out at the airport that the car I had booked had an accident the previous night so I was getting a Ford Figo instead, not the start I was looking for. In any case, we picked up our car and drove on to Gundlupet. The drive was thankfully uneventful and we reached the lodge just in time for our evening safari. I informed the staff that our 2 year old would be accompanying us for the safari and he was not very happy. He warned us that if she got scared or bored or cried, there was no way to return before the end of the safari. I assured him that our kid was an angel and he let us board.
Ira was actually quite amazing on the safari, especially for a 2 year old. She excitedly looked for animals and birds (she already has the ability to spot birds somehow!) and shouted out when she saw something interesting. Therein lay the problem, unfortunately. She was often too excited and that was a bit disturbing for the other occupants of the jeep. She was not very disruptive though, and she fell asleep for the last third of the trip.
It rained for a while during the safari and that freshened the forest up a bit. The light looked divine and I was really excited about seeing an exotic animal or bird at that point. We saw some deer, gaur, peacocks and langurs, a mongoose on one side, a black-naped hare on the other. No tigers, no leopards, no sloth bears. I did not miss them either, because the entire experience was just wonderful - the lighting was great, the weather was pleasant and the birds and animals that we did see looked beautiful.
For the remaining trip we decided to alternate safaris to avoid disturbing our fellow holidayers. Nisha would do the safari next morning, I would do the following evening and morning and then Nisha would do the last one in the evening. We had already decided to skip the Monday morning safari in the interest of getting to Bangalore in time on Monday.
Nisha’s morning safari was a success - she saw a big male leopard ambling across, about 100 meters or so from her jeep. In her excitement she forgot to zoom in on the cat and managed to get some interesting habitat shots instead. Either way, we had our first sighting! I was excited at the prospect of seeing the leopard that evening. Something else was in store for me though.
My evening safari jeep had three families with young children and at first I did not think much of it. Once we entered the forest however, some of the children and adults were quite annoying and were constantly making noise. There were discussions of cricket as we trudged along and a lot of the shushing from the naturalist went unheeded. The most annoying bit was when we waited for a leopard to cross our track and at that precise moment one of the kids wanted to go pee. A parent stood up and demanded that the driver take them to a place where she could pee. As it turned out, the leopard did cross that path and later that night, also brought a kill to that spot. Such was my luck that evening. Despite that, I did manage to get my first sighting of a Crested Serpent Eagle, so the evening was not completely wasted.
I requested that I be put on a different jeep the next day and that set me up for the most memorable safari yet. No, I did not see a tiger, nor a leopard nor a sloth bear.
We saw elephants, but that was not the highlight either, even though it was really exciting.
I saw the first Indian Nightjar of my life! I had never imagined seeing a nightjar in my lifetime because I consider myself an average (or maybe a bit below) birder. Thanks to the wonderful company I had in the jeep, we were able to spot the beauty just as it flew from the front of our jeep to a tree nearby. But then, even if I had not seen a nightjar, this would have been the best safari of the trip because of the company I had. In addition to a good driver (but then all of the driver/guides at the safari lodge are terrific) we had a couple of keen wildlife photographers who were great at spotting and tracking and best of all, nobody was talking, let alone about cricket. I now wanted to do the Monday morning safari too and not give it up. I spent the afternoon trying to convince Nisha.
It was again Nisha’s turn and as expected, she came back with dozens of shots of a popular male tiger called Prince. The big male was lazing in a waterhole and all of the jeeps had converged on him, everyone firing away furiously on their cameras. Since I had not seen any big cats, Nisha let me do the Monday safari.
We started the morning looking for tiger tracks. We saw tracks of a male in the area that Prince was seen the previous evening and were following it. The driver got a phone call and was told of a sighting in a different sector of the forest. He apologized to us and started speeding away to the other section of the forest. We held on to our dear lives!
After about 15 minutes of a very bumpy ride, we reached a spot where a couple of other jeeps were already waiting. After another 10 minutes or so, the big male crossed over, just beyond our sight! Our driver made a desperate last attempt to drive closer so that we could get one shot, but it was too late. He was not one to give up though and quickly guessed that the tiger was headed to the waterhole nearby. We sped to that place and waited. In no time, we saw the huge male amble down to the edge of the pond. He took a drink and then hind legs first, settled into the mossy water. The Basavanagatta male (that is what he was called, although I am sure the spelling is grossly wrong) stayed there for a long time, glancing at us now and then. He was a little over 100 meters away from us, so he did not have any reason to feel nervous. He finally got bored of sitting in there and swam over to the other end of the pool and walked off.
Our driver instantly knew which way he was going and started driving around to the other side of the huge thicket. We waited there and finally the cat stepped out from the thicket. The huge cat looked at us, gave a snarl and ambled into another thicket. This time he was not more than 20 meters away. We spent some more time waiting to see if he would come out from another side of the thicket, but he did not, or maybe he escaped from some other spot, we don’t know. What I did know was that I wanted to do another safari!
I had run out of time though, so we had to check out and drive back to Bangalore. The drive back had a hint of melancholy as both of us wanted to stay longer. The lodge itself did not exactly ooze luxury (it was quite basic) but the people were warm and the forests were enchanting. Stories of people staying there for weeks at a time did not help as I wanted to do that too. Maybe some day I will go there without prior plan to return…
The Whole Story
I could read the tiny sign on the right of the line of letters that said 10/10. A similar tiny sign on the left said 6/6. The letters in the middle were also more or less perfect except for the B which seemed a bit muddy.
“Congratulations, you’re testing 6/6!”, she said.
“It is sharp but there is a slight haze, like when I get oily fingers on my glasses and try to wipe it off.”
“That will go away, don’t worry :)”
“And what’s with the dilating eye drops every night? They waste my mornings because I see halos for hours until their effect wears off.”
“Those are to relax your focussing muscles and help you heal faster. They’re only for 5 days, so you don’t have to live with that forever.”
And there you are, the end of a life-changing episode that began under a month ago. I know I had promised to write this ‘live’, but the sequence of events was such that I did not have any time until today. It is not that late though; my left lens was implanted on Tuesday, the 29th of March and the right lens on Wednesday the 30th. So consider this deferred live :)
My original appointment was on 21st to repeat the iridotomy in my right eye since even lasers were unable to pierce my eye of steel! That appointment was honoured and I had my iridotomy just like I had ordered, a little less painful than the last time. The bad news though was that my lenses had not arrived and hence we could not do the implants on 22nd and 23rd as planned. The lenses finally arrived over the weekend and we narrowed down on 29th and 30th for the surgeries.
The Left Eye
We started out early on Tuesday. I was strangely not very nervous, just wondering what it would be like without glasses. We found out on the drive to the hospital that they needed me to give them a blood sample before the surgery to screen for HIV, diabetes and some other common conditions. It should have been done previously but they failed to notify us, and it meant a delay of a couple of hours for us. We were slightly annoyed but I am patient with such things - we’re humans and minor oversight is OK as long as it does not have serious consequences.
We reached and I gave my blood sample and the nurse put eye dilating drops into my left eye, and the wait began. I had to watch people go in ahead of me as my blood sample was being tested, but it was OK since I had Nisha to harass with my silly jokes and theories. When they were finally ready for me, I was led into the OT along with an old lady who was about to have her cataract surgery.
We were greeted in the pre-Op room by a horrific sight. An old lady was lying on a bed and an anaesthetist was piercing her eye with a long needle as she screamed about how much it hurt. In the room was another bed and a couple of chairs where more people sat and watched the scene in horror. One of the girls seemed to be weeping.
I wasn’t sure what to make of it. I am not a great fan of poking things into my eyes and seeing that definitely made my stomach churn. The anaesthetist calmed us by telling us that we did not have to go through that and topical anaesthesia was sufficient. Big relief, but it would have been better if we were not treated to that sight.
The rest of the wait was relatively uneventful and my turn was the last because they had to mark the axis of my toric lens in my eye. This involved putting a clamp to hold my eye open and then marking the axis with a device smeared with marker ink. High tech stuff!
Once the marking was in place, I was led to the OT bed and I was in an inexplicably chatty mood, asking silly questions to the doctor. The doc asked me to shut up and in hindsight, I realize that I may have been very nervous. As the operation progressed, he told me whenever he was doing something important, like inserting the lens or cleaning the eye or adjusting the lens. I felt some pressure throughout the operation, but no real pain. At the end, they gave me a pair of tacky eye shield glasses and led me out into the pre-Op room.
I had expected to be able to see from my left eye walking out of the operating room, but that did not happen because there was a light shining into my eye for the entire duration of the operation. Within minutes however the glare cleared and by the time I met Nisha outside the OT, I could see her clearly! We then spent a couple of hours with me chattering away in excitement, relating the anaesthesia story to her and her coaxing me to take a short nap. There were eye drops to be administered every 15 minutes so there wasn’t much chance of me sleeping anyway. The doc checked my eye on the way out, declared it to be ‘perfect’ and told me to administer the drops regularly so that we can implant the lens into my other eye the next day.
I spent the evening napping and drowning my eye in drops, eagerly anticipating the operation the following day and the resultant clear vision.
The Right Eye
I got up the next day and found that while the vision in my left eye was sharp, it was hazy, like when one smears oil on one’s glasses and tries in vain to clean them with a cloth. I mentioned that to the doc when we reached and he said it would clear in a few days. To his credit, it cleared by evening. My left eye inflammation had reduced significantly, so he gave a go-ahead for the right eye implant on the same day. So I was back upstairs to flood my right eye with dilating drops. The drill right up to the operation was the same as for the left eye (without the anaesthesia scare this time) and soon enough it was my turn to be operated on.
This time though, the procedure hurt a bit more than it did for my left eye. I mentioned it to the doc and he said we were almost done. Sure enough, we were, and after it was done, I could see clearly! The human brain seems to have amazing resolution; it seems able to take two images at different exposures and produce one with acceptable resolution and exposure. I was very happy stepping out into the pre-Op area and finally out into the recovery area. There I was even less capable of relaxing than the previous day because now I had an almost perfect set of eyes to experiment with. The result was that I was much more tired by the end of the day and was glad to hit the sack.
The test the following day showed that my lenses were perfectly measured and that I could have perfect vision once my eyes had healed fully. Now my next visit is in about a week. I will probably not write about it unless there is something interesting to share. I have already begun hunting for my ghost glasses, only to grab the side of my face. The biggest difference however is that there is no longer a shield between my eyes and the air outside. My eyes can feel the air freely and it is really unnerving! It will take me a while to get used to.
Now off to rest my newly acquired eyes…
The Whole Story
The fact that I can write this means that I have not lost my eyesight after being punched by a laser! There are a lot of things to be aware of though, so let me start from the beginning.
The day started with the realization that mom had an appointment with her doctor and we would have to take Ira along with us. The counsellor at Vasan told me that I could drive in so I was not very concerned about the procedure despite the scary stories online. Nisha however was not taking any chances and we ended up taking a cab. In hindsight, that was a great decision.
We reached right on time and my eyes were flooded with drops the moment I sat in the waiting area. The nurse topped up the drops some 3-4 times and through the hour and a half of waiting, all I could do was listen to Nisha chasing Ira around as the little monster made the hospital her playground. After a little less than an hour a dull headache began to creep in; the doctor said it was expected and in fact an indication that the constricting drops were working.
Once he was satisfied with the state of my eyes, I was directed to the YAG laser room. The doc entered with a smug grin and asked me if I was ready. I had forgotten the horror stories by then and just shrugged and smiled. He reminded me that it was going to hurt a bit. That wasn’t enough of a warning; I had to actually experience it to realize how bad it would be. The doc poured some liquid into what looked like a small suction cup with a lens and stuck that to my right eye. After a lot of looking around my eye, he identified a spot and said, “Ready!”. There was a click and with it a hard flick to my eye. “That hurt a bit”, I told him and he only smiled. The first shot did not quite punch a hole in my iris and he had to take another shot. He told me the tissue of the iris of my right eye was pretty thick. “Is that good or bad?”, I asked. “In this case, not good”, he replied with a light snigger.
His sense of humour was a bit dark but I didn’t mind, maybe because I have a similar sense of humour. The second shot hurt just as much, but I knew what to expect so it was kinda OK. That did not work either, so he decided to move on to the left eye. After a lot of searching, he made one shot on the left eye and we had a hole. “There was a nice spot on the left iris with thinner tissue so I knew the moment the laser fired that we had a good hole”, he said. He decided against making a third shot on the right eye and we decided to do it later.
Within minutes I started feeling a headache that grew worse by the minute. We went to his consulting room to discuss the schedule for the implants and the second iridotomy. I am scheduled to fly to Bangkok for Linaro Connect this weekend, so it had to be after I returned on the 19th. The tentative schedule now is that we’ll repeat the iridotomy on the right eye on the 21st, implant the lens in my left eye on the 22nd and then the right eye on the 25th, which is a Good Friday and hence one less working day sacrificed.
With that out of the way, I was prescribed 2 eye drops for 5 days and sent home. As we stepped out to have lunch, my head had started splitting, with mild nausea setting in, the kind one gets with a bad migraine. I could barely taste the food I had, such was the intensity of the headache at times. Ira’s constant flitting around (she’s approaching her terrible twos now) did not help things a lot. We finally got lunch over with and got into the cab back home. The nap in the cab worked wonders and that, followed by an hour’s nap at home, got rid of the headache. I am still seeing things a little darker than usual (the pupil is constricted to limit light entering the lens) but I can see sharply with my glasses, unlike the hazy overexposure due to the retina scan dilation yesterday.
The holes now mean that there is no turning back. Provided that the lens measurements don’t need to be repeated, it looks like I will be rid of my glasses before the end of the month.
The Whole Story
I usually don’t write about my very personal affairs like my eyesight (which is really poor) but I decided to make an exception this time. After a lot of mulling over it, I have decided to ‘go under the knife’ to fix my almost blind vision. I read a lot of blog posts about personal experiences and I decided to document my own experience because there aren’t any posts about the new lenses I will be getting, viz. the EyePCL by Care Group India. Most blog posts seem to be about the Visian ICL.
I am -10 diopters in both eyes with -2.75 astigmatism. This made me ineligible for the supposedly simpler (and definitely cheaper) LASIK procedure since that would leave me with little or no cornea for further corrections. The doctor advised that I do an ICL instead, which would cost four times as much (about ₹70,000 per eye as opposed to ₹35,000 for both eyes for LASIK) but would be a reversible procedure.
This was a little over a year ago and I finally took the plunge today. I went for a fresh work up today (at Vasan Eye Care, Kothrud, Pune since they did a decent LASIK job with Nisha and Siddhi’s eyes) and came back with blurry vision due to the retina check up. The highlight today was the white-to-white measurement which involved putting a clamp around my eye to prevent me from blinking while the doctor measured my eye with a vernier calliper! They had put numbing drops so that it didn’t hurt when the calliper touched the eye but it was a bit uncomfortable nevertheless.
Next in the process is an iridotomy, which involves punching one or more holes in the periphery of my iris to ease intraocular pressure. This is done because the most common side effect of the IPCL is increased intraocular pressure, which could result in glaucoma. This happens tomorrow, so I hope to write another post about it then.
The madness is over. FUDCon Pune 2015 happened between 26-28 June 2015, and we successfully hosted a large number of people at MIT College of Engineering. This was not without challenges though and we met yesterday to understand what went well for us (i.e. the FUDCon volunteer team) and what could have been better. This post however is not just a summary of that discussion, since it is heavily coloured by my own impression of how we planned and executed the event.
Our bid was pretty easy to get together because we had a pretty strong organizer group at the outset and we more or less knew exactly what we wanted to do. We wanted to do a developer focussed conference that users could attend and hopefully become contributors to the Fedora project. The definition of developer is a bit liberal here, to mean any contributor who can pitch in to the Fedora project in any capacity. The only competing bid was from Phnom Penh and it wasn’t a serious competition by any stretch of the imagination, since its only opposition to our bid was “India has had many FUDCons before”. That combined with some serious problems with their bid (primarily cash management related) meant that Pune was the obvious choice. We had trouble getting an official verdict on the bid due to Christmas vacations in the West, but we finally had a positive verdict in January.
The call for participants went out almost immediately after the bid verdict was announced. We gave about a month for people to submit their proposals and once we did that, a lot of us set out pinging individuals and organizations within the Open Source community. This worked because we got 142 proposals, much more than we had imagined.
We had set out with the idea of doing just 3 parallel tracks because some of us were of the opinion that more tracks would simply reduce what an individual could take away from the conference. This also meant that we had at most 40 slots with workshops taking up 2 slots instead of 1.
The website took up most of my time and in hindsight, it was time that I could have put elsewhere. We struggled with Drupal as none of us knew how to wrangle it. I took on the brave (foolhardy?) task of upgrading the Drupal instance and migrating all of the content, only to find out that the schedule view was terrible and incredibly non-intuitive. I don’t blame Drupal or COD for it though; I am pretty sure I missed something obvious. SaniSoft came to the rescue, and we were able to host our schedule at shdlr.com.
After the amazing response in the CfP, we were tempted to increase the number of tracks since a lot of submissions looked very promising. However, we held on tight and went about making a short list. After a lot of discussions, we finally gave in to the idea of making a separate workshop track and after even more discussions, we separated out a Container track, a Distributed Storage track and an OpenStack track. So all of a sudden, we now had 5 tracks in a day instead of 3!
Sankarshan continually reminded me to reach out to speakers at the event to make sure that their talks fit in with our goals. I could not do that, mainly because we did not have the bandwidth, but also because, in hindsight, I realize that our goal wasn’t refined beyond the fact that we wanted a more technical event. The result was that we made a couple of poor choices, the most notable being the opening keynote of the conference. The talk about Delivering Fedora for everyone was an excellent submission, but all of us misunderstood the content of the talk. The talk was a lot more focussed than we had thought it would be and it ended up being the wrong beginning for the conference since it seemed to scare away a lot of students.
The overall content profile, however, was pretty strong and most individual talks had almost full rooms. The auditorium looked empty for a lot of talks, but that was because each row of the massive auditorium could house 26 people, so even a hundred people in the auditorium filled only the first few rows. The kernel talks had full houses and the Container, OpenStack and Storage tracks were packed. It was heartening to see some talks where many in the audience followed the speaker out to discuss the topic further with them.
One clear failure on the content front was the Barcamp idea. We did a poor job of planning it and an even poorer job of executing it.
Travel, Accommodation and Commute
We did a great job on travel and accommodation planning and execution. Travel subsidy arrangements were well planned and announced and we had regular meetings to decide on them. Accommodation was negotiated and booked well in advance and we had few issues on that front except for the occasionally overloaded network at the hotel. We had excellent support for visa applications as well as for making sure that speakers were picked up from and dropped off at the airport on time. The venue was far from the hotel, so we had buses to ferry everyone across. Although that was tiring, it was done with perfect precision and we had no unpleasant surprises in the end.
Materials, Goodies and SWAG
We had over 2 months from the close of CfP to conference day, and we wasted a lot of that time when we should have been ordering and readying swag. This is probably the biggest mistake we made in planning and it bit us quite hard in the closing weeks. A vendor bailed on us near the end, leading to a scramble to Raviwar Peth to try and get people to make us stuff in just over a week. We were lucky to find such vendors, but we ended up making some compromises on quality. Not with the t-shirts though, since those came from an old, reliable vendor we had forgotten about during the original quote collection. He worked night and day and delivered the t-shirts and socks despite the heavy Mumbai rains.
The design team was amazing with their quick responses to our requests and made sure we had the artwork we needed. They worked with some unreasonable deadlines and demands and came out on top on all of them. The best part was getting the opportunity to host all of them together on the final day of the conference and doing a Design track where they did sessions on Inkscape, Blender and GIMP.
We struggled with some basic things with the print vendor like sizes and colours, but we were able to fix most of those problems in time.
We settled on MIT College of Engineering as the venue after considering 2 other colleges. We did not want to do the event at COEP again since they hosted the event in 2011. They had done really well, but we wanted to give another college the opportunity to host the event. I had been to MIT weeks earlier as a speaker at their technical event called Teknothon and found their students to be pretty involved in Open Source and technology in general, so it seemed natural to suggest them as potential hosts. MITCOE were very positive and were willing to become hosts. With a large auditorium and acceptably good facilities, we finalized MITCOE as our venue of choice.
One of the major issues with the venue though was the layout of the session rooms. We had an auditorium, classrooms on the second floor of another building and classrooms on the fourth floor of the same building. The biggest trouble was getting from the auditorium to that other building and back. The passages were confusing and a lot of people struggled to get from one section to the other. We had put up signs, but they clearly weren’t good enough and some people just gave up and sat wherever they were. I don’t know if people left out of frustration; I hope they didn’t.
The facilities were pretty basic, but the volunteers and staff did their best to work around that. WiFi did not work on the first two days, but the internet connection for streaming talks from the main tracks worked and there were a number of people following the conference remotely.
HasGeek pitched in with videography for the main tracks and they were amazing throughout the 3 days. There were some issues on the first day in the auditorium, but they were fixed and the remainder of the conference went pretty smoothly. We also had a couple of laptops to record (but not stream) talks in other tracks. We haven’t reviewed their quality yet, so the jury is still out on how useful they were.
Volunteers and Outreach
While our CfP outreach was active and got good results, our outreach in general left a lot to be desired. Our efforts to engage student volunteers and the college were more or less non-existent until the last days before the conference. We spoke to our volunteers for the first time only a couple of days before the conference and as expected, many of the volunteers did not even know what to expect from us or the conference. This meant that there was barely any connection between us.
Likewise, our media efforts were very weak. Our presence on social media was not worth talking about and we only reached out to other colleges and organizations in the final weeks before the conference. Again, we did not invest any effort in engaging organizations to try and form a community around us. We did have a Twitter outreach campaign in the last weeks, but the content of the tweets actually ended up annoying more people than making a positive difference. We failed to engage speakers to talk about their content or share teasers to build interest for their sessions.
Best. FUDPub. Ever.
After looking at some conventional venues (i.e. typical dinner and drinks places) for dinner and FUDPub, we finally settled for the idea of having the social event at a bowling arcade. Our hosts were Blu’O at the Phoenix Market City mall. The venue had everything from bowling to pool tables, from karaoke rooms to a dance floor. It had everything for everyone and everyone seemed to enjoy it immensely. I know I did, despite my arm almost falling off the next day :)
We had an approval for up to $15,000 from the Fedora budget and we got support from a couple of other Red Hat departments for $5,000 each, giving us a total room of $25,000. The final picture on the budget consumption is still a work in progress as we sort out all of the bills and make reimbursements in the coming weeks. I will write another blog post describing that in detail, and also how we managed and monitored the budget over the course of the execution.
We did a pretty decent event this time and it seemed like a lot of attendees enjoyed the content a lot. We could have done a lot better on the venue front, but the efforts from the staff and volunteers were commendable. Would I do this again? Maybe not, but that has more to do with wanting to get back to programming again than with the event organization itself. Setting up such a major conference is a lot of work and things only get better with practice. Occasional organizers like yours truly cannot do justice to a conference of this size if they were to do it just once every five years. This probably calls for a dedicated team that does such events.
There were also questions of whether such large conferences were relevant anymore. Some stated their preference for micro-conferences that focussed on a specific subset of the technology landscape, but others argued that having 10 conferences for 10 different technologies was taxing for budgets, since it is not uncommon for an individual to be interested in more than one technology. In any case, this will shape the future of FUDCon and maybe even Flock, since with such a concentration of focus, Flock could end up becoming a meetup where contributors talk only about governance issues and matters specific to the Fedora project and not the broader technology spectrum that makes Fedora products.
In the end though, FUDCon is where I made friends in 2011 and again, it was the same in 2015. The conference brought people from different projects together and I got to know a lot of very interesting people. But most of all, the friends I made within our volunteer team were the biggest takeaway from the event. We did everything together, we fought and we supported each other when it mattered. There may be things I would have done differently if I did this again, but I would not have asked for a different set of people to work with.
We had a major change earlier this week, with the new fudcon.in website going live. This was a major task I was involved in over the last couple of weeks, and also one of the major reasons why we did not have a lot of visible action on the website. Hopefully you’ll see more action in the coming weeks as we come closer to the big day with just over a month to go.
Why did we do it?
The old fudcon.in website was based on Drupal 6.x with the COD module. Technically, this is a supported version of Drupal, but that is a pointless detail because every security or bug fix update was painful. The primary reason, it seemed to us, was COD. The 6.x version seemed more or less dead. We still stuck to it however, since the 7.x upgrade was far more painful than doing these updates and hacking at settings to get things working again.
That was until we decided to add the Speaker bio field to our sessions.
The COD module is very versatile and can let you ask for arbitrary information about a session. However, when you add a field, you can capture data from users, but cannot actually show it. The problem seemed to be in the way COD stored its additional data - Drupal seemed unable to query it when displaying the session node and hence would not show any of the additional fields, like the FAS username, Twitter handle and speaker bio. Praveen and I hacked at the settings for days and couldn’t get it to work. We went live with the missing speaker bio, which apparently nobody else seemed to notice.
However, when we put out the talk list, the absence of the speaker bio was evident, so I decided to take a crack at fixing it in code. I gave up because I was quickly overwhelmed by the Drupal maze of dependencies - I have spent way too long away from the web app world - and decided that I may have an easier time upgrading all of Drupal and COD to 7.x than peering at the Drupal/COD code and then maintaining a patch for it. I also felt that the upgrade would serve us better in the longer run, when we have to use the website to host a future FUDCon - upgrading from 7.x ought to be easier than upgrading from 6.x.
How we did it
I sat down one weekend to upgrade the Drupal instance. The instructions make it sound so easy - retain the sites directory and your modules, change the rest of the code, call the Drupal update.php script and wait for it to do its magic. It is that easy, if your website does not use anything more than the popular modules. With COD, it is basically impossible to go from 6.x to 7.x, especially if you have added custom fields like we did.
Data definitions for COD seemed to have changed completely between 6.x and 7.x, making it near impossible to write a sensible migration script, especially when the migrator (yours truly) has no idea what the schema is. So I went about it the neanderthal way - remove all content, retain all users and then upgrade to Drupal 7.x from COD 6.x. That thankfully worked like a charm. This was a useful first step because it meant that at least we did not have to ask users to sign up again or add hundreds of accounts manually.
Once our user schema was on 7.x, the next task was to get COD 7.x. This again worked out quite easily since COD did not complain at all. Why would it - there was no conference content to migrate! Creating a new event and basic pages for the event was pretty straightforward and in fact, nicer since the new COD puts conference content in its own namespace. This meant that previously shared links would break, but I didn’t want to bother with trying to fix that because there were only a few links shared out there. If this is too big a problem, we could write a .htaccess rule to do a redirect.
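For the curious, a redirect of that sort could be sketched with a mod_rewrite rule along these lines (the path patterns below are hypothetical - the actual target would depend on the namespace the new COD uses for our event):

```apache
# Hypothetical sketch: permanently redirect old Drupal 6 node URLs
# to their new home under the COD 7.x event namespace.
RewriteEngine On
RewriteRule ^node/(\d+)$ /fudcon-pune-2015/node/$1 [R=301,L]
```

A 301 keeps old links shared on social media working and tells search engines to update their index to the new paths.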
Adding sessions back was a challenge. It took me a while to figure out all of the data that gets added for each session and in the end I gave up due to exhaustion. Since there were just about 140 session entries to make, Praveen and I split that work and entered them ourselves. Amita and Suprith then compared the content with fudcon.in to verify that it was all the same, and then finally Praveen pushed the button to upgrade.
Like everything else, this upgrade taught me a few things. Web apps in general don’t think a lot about backward compatibility, which is probably justified since keeping backward compatibility often results in future designs being constrained - not something a lot of developers are comfortable with. I also had to refresh a lot of my database fu - it’s been more than 6 years since the last time I wrote any serious SQL queries.
The biggest lesson I got though was the realization that I am no longer young enough to pull an all-nighter to do a job and then come back fresh the next day.