Friday 31 August 2018

Crazy UB when casting to enum values in C++


Hi folks, this time it will be a kind of photo story! Something new for a change, aaaaand I have to type much less text!

It started a couple of days ago, when I saw the following quiz on Twitter:

My answer was: all of them, of course! In C or C++98 they would probably all be fine, but who knows what crazy UB stuff they have prepared for us in C++20. So, seen logically, all of that stuff should be undefined behaviour! But wait, didn't I use extensions to enum ranges in some project in some framework, by casting integers to enums? That did work OK, didn't it? OK, it was a couple of years ago, C++03 I suppose, but AFAIK nothing has changed in C++11 in that respect, so maybe we did it wrong all the time back then? So how did it work?

So I wasn't so sure anymore, and would say A+B are UB; C is something that C++11 introduced, so no idea, but probably UB too. But wait, are all of them UB again? Probably, but the trouble was that there was no such answer in the quiz! Crazy!

When I saw the answer I was even more perplexed:
What? Why 3 but not 4? And that numeric_limits<>::max() can't be right either! In a follow-up tweet the relevant spots in the standard were shown, and I read them, but was none the wiser - standardese isn't the lightest read on earth. ☹️

But before I gave up I remembered that I had already seen something similar on Twitter, namely in an (accidentally also cited) tweet by @lefticus:
So he had the same gut feeling as me, but he was corrected by some people well-versed in interpreting the standard (whom I also happen to follow):
and:

That was then even confirmed by @CppSage himself:

This all explains it pretty well. But isn't that crazy? Binding the logical range of a type to its underlying representation? I'd understand that if we were in the '80s, but in the C++20 standard, with all its lofty newfangled features? OK, I suppose it was always like that in C, but I'm too lazy to check it, sorry.... 😩
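To make the rule a bit more tangible, here's a minimal example of how I understand it (the enum and its values are made up by me): for an enum without a fixed underlying type, the valid values are only those representable in the smallest bit-field that can hold all the enumerators.

enum Color { red = 1, green = 2, blue = 3 };  // no fixed underlying type

// All enumerators fit into 2 bits, so the valid value range is 0..3.
Color ok = static_cast<Color>(3);  // fine: 3 is still within 0..3
Color ub = static_cast<Color>(4);  // UB: 4 doesn't fit into 2 bits!

So in this example 3 would be fine, but 4 is already UB - the logical range really is bound to the bits of the representation.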

BTW, now I've got an idea why this enum casting in that distant project might even have been working - we used the Visual Studio compiler! And I dimly remember that there was some Microsoft extension for enums setting int as their underlying type. Was it so? For that I'd have to check my own blogpost from last year, but frankly, I cannot be bothered. 😩 That project died a long time ago anyway.
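If I remember that correctly, it would also explain why the casts worked: when an enum has a fixed underlying type - standardized in C++11 with the enum-base syntax - every value of that underlying type is a valid value of the enum. A hypothetical example:

enum Flags : int { none = 0, all = 0xff };  // underlying type fixed to int

Flags f = static_cast<Flags>(12345);  // OK: every int value is a valid Flags value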

TL;DR: Twitter is a good place to learn about C++ - I learn something new there almost every day!

Monday 6 August 2018

Scripting my Vim Editor


As an inveterate vim user, the first thing (or one of the first things) I do when starting a new project under Windows is to install vim, to use it for searching, grepping and editing logfiles when I'm bugfixing. Or just to use it as a general viewer for all sorts of files.

Typically I'd set up my toolbar buttons (for vim 7.4) like this:
"" Caution: Needs 18x18 bitmaps in "C:\Program Files (x86)\Vim\vim73\bitmaps" !!!
:amenu ToolBar.-SEP- :
:tmenu ToolBar.tabedit Open new tab (mrkkrj)
:amenu ToolBar.tabedit :tabedit<cr>

:tmenu ToolBar.darklight Switch color themes (mrkkrj)
:amenu ToolBar.darklight :colo zellner<cr>
:tmenu ToolBar.lightdark Switch color themes (mrkkrj)
:amenu ToolBar.lightdark :colo torte<cr>

:tmenu ToolBar.smallfont Smaller font (mrkkrj)
:amenu ToolBar.smallfont :set guifont=Lucida_Console:h9<cr>
:tmenu ToolBar.bigfont Bigger font (mrkkrj)
:amenu ToolBar.bigfont :set guifont=Lucida_Console:h10<cr>
This gives me my standard button toolbar, with buttons for switching fonts and colors from big to small and from dark to light, plus a button for opening new tabs:


Somehow this was always enough, because I just used my default dark scheme plus one more dark and an alternative light one. As you know, the colors depend on the monitor you are using and the room you are sitting in, so in my new project I wanted to toggle through several schemes and font sizes before settling on my favorite. Thus I introduced the following functions:
:tmenu ToolBar.darklight Switch color themes (mrkkrj)
:amenu ToolBar.darklight :call ToggleLightScheme()<cr>
:tmenu ToolBar.lightdark Switch color themes (mrkkrj)
:amenu ToolBar.lightdark :call ToggleDarkScheme()<cr>

:tmenu ToolBar.smallfont Smaller font (mrkkrj)
:amenu ToolBar.smallfont :set guifont=Lucida_Console:h9<cr>
:tmenu ToolBar.bigfont Bigger font (mrkkrj)
:amenu ToolBar.bigfont :call ToggleBigFontSize()<cr>
The next thing is to write the actual functions that toggle between the dark and light color schemes. What I did was toggle between 2 light and 2 dark color schemes, like this:
:function ToggleDarkScheme()
:  if exists('g:colors_name')
:    if g:colors_name == 'evening'
:      colo darkblue
:    else
:      colo evening
:    endif
:  else
:    colo evening
:  endif
:endfunction

:function ToggleLightScheme()
:  if exists('g:colors_name')
:    if g:colors_name == 'zellner'
:      colo delek
:    else
:      colo zellner
:    endif
:  else
:    colo delek
:  endif
:endfunction
Toggling the fonts had to be done a little differently, because there's no global variable that vim sets when the font changes, like g:colors_name above. So I had to introduce my own: g:guifonts_name:
:let g:guifonts_name = 'Lucida_Console:h10'

:function ToggleBigFontSize()
:  if g:guifonts_name == 'Lucida_Console:h10'
:    set guifont=Lucida_Console:h12
:    let g:guifonts_name = 'Lucida_Console:h12'
:  else
:    set guifont=Lucida_Console:h10
:    let g:guifonts_name = 'Lucida_Console:h10'
:  endif
:endfunction
Admittedly, this is not a pretty and general solution: if I needed more colors or fonts, I'd have to add more ugly if-else clauses. The general solution would of course be to create a local array of color schemes and cycle through it, like it is shown here* (see the sketch below). But "form follows function..." and "use before reuse..." 😉 - and frankly, I didn't have time for that, vimscript's syntax** is pretty weird.
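Still, for completeness, here's a minimal sketch of what that array-based cycling could look like (untested, and the scheme names are just my picks - any installed schemes would do):

:let g:my_schemes = ['evening', 'darkblue', 'zellner', 'delek']
:let g:my_scheme_idx = 0

:function CycleColorScheme()
:  " advance the index, wrapping around at the end of the list
:  let g:my_scheme_idx = (g:my_scheme_idx + 1) % len(g:my_schemes)
:  execute 'colorscheme ' . g:my_schemes[g:my_scheme_idx]
:endfunction

Hooked up to a toolbar button just like above (:amenu ToolBar.cycle :call CycleColorScheme()<cr>), one button could then cycle through any number of schemes.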

--
* "Switch Color Schemes" in Vim Tips Wiki: http://vim.wikia.com/wiki/Switch_color_schemes

** There are some good intros you can use to deepen your knowledge of vim scripting: