So I love using Overleaf for writing LaTeX documents, but I can’t publish them as HTML :( I’ve investigated this before, but decided to take the plunge today and see how publishing LaTeX on WordPress works.
First example is the venerable 2 + 2 = 4:
So that worked, but it’s not dark and I’d like it bigger….
Now let’s give it something harder like the quadratic formula…
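For reference, here is the LaTeX markup for the quadratic formula (this is my own markup; the exact delimiters the WordPress plugin expects may differ):

```latex
x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
```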
Okay, not bad…. let’s step it up 🙂
Again, not bad… now I like to write about quantum computing… let’s see how it does here…
Now that will not do… For reference, I was using the out-of-the-box wordpress.com support, which they kind of just throw out there… time for more research…
Well, I really haven’t posted in quite a while. The reason is that I’ve been writing my book! After 14 long and grueling months, it is finally published. Please check it out at tinyurl.com/Math4QC.
So, this is my first post other than the quantum one in quite a while. I know that the way to get this started is just to put a post out there and start writing, so this is it. Hope to be here again soon. Thanks!
This is one of my favorite shirts, it’s basically a bunch of binary in a Microsoft logo…
My children love to ask, “What does this mean?” And as I think of an answer, my old consultant answer always comes back: “It depends” 🙂 It depends on how you told the computer to interpret the binary. Looking at the binary below:
You could tell a computer, “Hey, this is UTF-16” and suddenly that becomes the word “Serenity” to the computer. By default, Windows uses the little-endian flavor of UTF-16.
If I change this to UTF-16 big endian, suddenly the binary is interpreted as “匀攀爀攀渀椀琀礀”. Quite a difference!
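A quick Python sketch (my own, not part of the original post) shows the same bytes flipping meaning depending on which byte order you declare:

```python
# The same 16 bytes, interpreted two different ways.
data = "Serenity".encode("utf-16-le")

print(data.decode("utf-16-le"))  # Serenity
print(data.decode("utf-16-be"))  # 匀攀爀攀渀椀琀礀
```

Same bits on disk, two completely different strings. That’s all “it depends” means here.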
What is Most Significant?
The best example to start with is the familiar decimal numbers. For these, the most significant digit is the one with the greatest place value in the number. For the number 238, the digit 2 is the most significant, as it stands for 200. The least significant digit is the one with the least place value; for 238, that is the digit 8.
These definitions carry over to binary. The most significant bit (MSB) represents the greatest place value in a binary string, and the least significant bit (LSB) the least. Let’s look at the number nine in binary:
As you can see, the leftmost 1 is the MSB, as it represents 8, and the rightmost 1 is the LSB, as it represents 1.
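You can check this in a few lines of Python (my own illustration):

```python
n = 9
bits = format(n, "b")              # '1001' -- nine in binary
msb_value = 1 << (len(bits) - 1)   # leftmost bit's place value: 8
lsb_value = n & 1                  # rightmost bit's place value: 1

print(bits, msb_value, lsb_value)  # 1001 8 1
```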
Big-Endian vs Little-Endian
Endianness in our context has to do with how qubits are stored in memory. For this purpose, qubits are analogous to bits, so we will reuse the terms MSB and LSB. Big-endian (BE) stores the MSB at the smallest memory address and the LSB at the largest. Little-endian (LE) does the exact opposite: the MSB goes at the largest memory address and the LSB at the smallest.
The next question becomes: what are the smallest and largest memory addresses? Well, in Q#, qubits are stored in arrays called registers, as shown in this code:
// allocate a register of n qubits in |0>
use qubits = Qubit[n];
Most of the time endianness does not matter because you can specify which qubits in the register to operate on such as:
CNOT(qubits[0], qubits[1]);
But for certain operations, it matters a lot! One of these is the Quantum Fourier Transform (QFT). The Q# operation QFT expects a quantum register in big-endian format. If, however, you want to call it on a little-endian register, you can use the operation QFTLE (the LE at the end is for little-endian). There are a number of these paired operations in Q#, such as ApplyReversedOpBE and ApplyReversedOpLE.
Now to the big reveal: what is endianness in quantum computing, specifically Q#? Well, let’s think about it. If a register is nothing more than an array, then its lowest memory address is at index 0 and its highest at index n – 1. Using our definitions above, this means BE implies the MSB is stored at index 0 and the LSB at index n – 1. Indeed, this is what the documentation says for the BigEndian type in Q#:
BigEndian user defined type
Register that encodes an unsigned integer in big-endian order. The qubit with index 0 encodes the highest bit of an unsigned integer.
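To make the two conventions concrete, here is a classical Python sketch (my own, for illustration) of how the same array of bits decodes to different integers under each convention:

```python
def be_to_int(bits):
    """Index 0 holds the most significant bit (Q#'s BigEndian convention)."""
    n = 0
    for b in bits:
        n = (n << 1) | b
    return n

def le_to_int(bits):
    """Index 0 holds the least significant bit (the LittleEndian convention)."""
    return sum(b << i for i, b in enumerate(bits))

bits = [1, 0, 1, 1]
print(be_to_int(bits))  # 11  (1011 read left to right)
print(le_to_int(bits))  # 13  (1101 read right to left)
```

Same register contents, two different unsigned integers, which is exactly why operations like QFT and QFTLE have to agree with you on the convention.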
As I look out onto the landscape of software today, something is missing. It’s REQUIREMENTS! Fifteen years ago, testing was in a similar position. What slipped first in a hurried project? Testing. But then along came Agile, Kent Beck, and xUnit. Testing now has its own methodologies, like TDD, BDD, and the like.
Can you imagine if someone created a methodology called “Requirements Driven Development”?!?! They’d be laughed out the door…. 🙂
So why are Requirements being given such short shrift now? At best, they get lip service from the Agile community.
Use Cases? “Oh my god, I’m going to throw up!”
User Stories? “Yeah, just write it on a card, we’ll figure it out later, and make sure to rip it up at the end. That crap is useless.”
Really!?! Are you f’ing kidding me? This stuff matters! Well, it definitely matters when things go south on a software project, you know, “we’re going to court” south. But by then it’s too late. Everyone’s trying to reverse engineer the requirements from the existing software. It’s a tragic charade.
So here’s to software requirements. They matter. Don’t believe me? When a lawyer asks you, “Was that an enhancement or a bug?” and you answer, “Uh, it was in the user story…”, well, then we can talk 🙂
An amazing article in the NY Times talks about how software is taking over everything, even our cars! We can’t see this software, and it has already been exposed to hacking (a Jeep driven off the road by a remote hacker) and to the manipulation of emissions tests at Volkswagen. People in the article argue cogently that this code should be made open source so that we can examine it and know what we are buying as consumers. At the very least, it should go through some of the strict reviews and audits that airplane software goes through. Congress and the federal government should act before something bigger than these two examples forces them to.
In today’s software development community, Use Cases are often frowned upon. Do a quick Google search for “Use Cases Scrum” and you’ll find they are pitted against User Stories and quickly lose the fight. I believe in Use Cases because they force stakeholders and the development team to have the right discussions in a structured way. They also expose many things you would not think about when writing requirements in other ways.
But the art of writing Use Cases is dying. “Uncle Bob” Martin has said that it shouldn’t take longer than 15 minutes to teach someone how to write use cases[1]. He’s wrong and unfortunately hyperbolic. But these are the Agile times we live in, when everything invented before the Protestant-like reformation is looked upon as sacrilege.
I believe in Scrum. I think it can wholly benefit organizations with small teams that need to be more nimble or agile. But I don’t think Scrum is exclusive from Use Cases. Here is the definition of a product backlog from the Bible of Scrum, The Scrum Guide:
The Product Backlog is an ordered list of everything that might be needed in the product and is the single source of requirements for any changes to be made to the product.
Notice it says requirements. The Scrum Guide does not say how to do requirements (User Stories come from XP), it just says that they need to be in the Product Backlog.
So this is where my proposal for Use Case – Driven Scrum starts. Put your Use Cases in the Product Backlog. Now one of the criticisms of Use Cases is that they are too much documentation and take too long to write. Well, don’t write them out then! Just start by identifying the Use Cases you should do (give only their title). For example, put the Use Case “Log into system” into the backlog, but don’t bother to detail it out at first.
Scrum practitioners know that undefined product backlog items belong at the bottom and as they move up in priority, they become better groomed as the following picture illustrates.
This leads to the second part of my proposal. Refine the Use Cases as they move up the backlog. Add the basic flow or maybe the primary actors. This becomes part of your Product Backlog grooming.
Finally, most full use cases with all their basic and alternative flows will not fit into one sprint. So the last part of the proposal is to break them down into scenarios that do fit into one sprint. Mind you, use case flows and scenarios are not the same! The basic flow is always a scenario, but mixing in the alternative flows is where it gets interesting. 🙂
The tactics of breaking product backlog items up really depends on the tool you use for tracking your work. Spreadsheets, Rally, and Team Foundation Server all have different ways to do this. I hope you’ve enjoyed this article and would love to hear your feedback below. Good luck in your journeys of software development!
There is a great article over at O’Reilly entitled “Striking parallels between mathematics and software engineering”. I’ve never really thought about the parallels between math and software engineering. I’ve thought about civil engineering, medicine, and the law, but not mathematics. To summarize, the author says that mathematics is really about modelling, and that is what we do in software engineering continually, especially when following object-oriented paradigms. It is striking and has opened my eyes to a whole other avenue to explore when it comes to software engineering. Just thought I’d share 🙂
I have founded a new Software Engineering company, Turing Software, LLC. Please head over there to attain services such as consulting, training, and custom application development. I’m also going to start blogging over there, so if you enjoy these articles please continue to read them over there. I’ll also post here, but it will be more of a personal nature and probably less frequent. Thanks for reading my blog!
So I tried the new virtual machines on Azure for Visual Studio. I’ve always dreamed of using a VM to do my development on, but never really trusted it because Visual Studio (VS) is such a performance hog. Well, here are my results. I downloaded “ImageResizer” from Codeplex, a popular C# program, and then built it on my local machine and on the Visual Studio VM. My local machine runs 64-bit Win 8.1 Pro with an Intel i5 4670K CPU @ 3.4 GHz and 8 GB of RAM, running VS Ultimate 2012. The Azure VM has an AMD Opteron Processor 4171 HE at 2.10 GHz and 3.5 GB of RAM on 64-bit Windows Server 2012 R2 Datacenter, running VS Pro 14 CTP (the latest and greatest).
Now, the results.
My local machine built it in ~1.1 seconds.
The VM built it in ~3.1 seconds.
A factor of about 3. Not great, but not that bad either. I could see myself doing it, maybe… lots of advantages (clean machine, always running the latest and greatest, etc.). But it still feels like it’s on the cusp of prime time.
Leslie Lamport, a Principal Researcher at Microsoft Research, has been announced by the Association for Computing Machinery (ACM) as the winner of the 2013 Turing Award. The Turing Award is the equivalent of the Nobel Prize in computing. I always look at these prize winners as “gods” of the computing world, and I think it’s important that we remember and honor our history if we are to progress as a profession. He won for his work in distributed computing, which, I can tell you from my graduate course, is some very difficult stuff. He also created LaTeX, among other things. Congratulations!