Superb fix for high DPI problems with Notes and Windows

If you are running a PC with a high resolution screen (pretty much any high end laptop these days for example), you’ll have experienced the issue where the scaling of text, images and icons in the Notes and Designer clients is awful to the point of being unusable.

Dan Antonielli has come up with a solution which, for me at least, seems to work very well:

The basic idea is that when an application launches, Windows asks whether it is high-DPI aware. Notes (well, Eclipse) claims that it is, so Windows believes it and we get the resulting mess. But with the new registry entry mentioned in the blog above, we tell Windows to first look for an external manifest file that can override this default. So we can tell Windows that Notes is not high-DPI aware, and Windows handles the scaling for us. 
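For reference, the fix looks roughly like this (a sketch; do check the original post for the exact manifest). First, a registry value that tells Windows to prefer external manifest files:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\SideBySide]
"PreferExternalManifest"=dword:00000001
```

Then a notes.exe.manifest file placed alongside notes.exe that declares the application not high-DPI aware:

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0"
          xmlns:asmv3="urn:schemas-microsoft-com:asm.v3">
  <asmv3:application>
    <asmv3:windowsSettings xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">
      <dpiAware>false</dpiAware>
    </asmv3:windowsSettings>
  </asmv3:application>
</assembly>
```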

It’s not a perfect solution, but until Eclipse / IBM come up with something better, it’s a good fix.

Edit 23rd May 2016: I had cause to revisit this issue thanks to a customer, and found that the solution above didn’t always work; I couldn’t find the reason. However, if you install 9.0.1 FP4 or higher then you should find that things are a lot better than they were. Still not perfect, but usable at least.

Introduction to LDC Via Webinar

We’re very aware at LDC Via that lots of people are interested in our offering, but it can be a little overwhelming to get started, so on Tuesday the 12th Jan, we’re going to talk and demo for an hour about what LDC Via is, how it works and how you can use it to liberate your Domino data.


It’s obviously a free event and we’d love to have you there. You can register here. And what better way can there be to start a new year than to attend a webinar 🙂

Of course we will all be attending or speaking at Connect 2016 in a few weeks as well, so if you would like to chat with any of us, please drop us a line to arrange a meeting. 

ICON UK is fast approaching

We’ve had ICON UK (formerly UKLUG) at around this time every year since 2007, and this year is no exception. For me, though, it’ll be a different event: in previous years I’ve always helped out with organising behind the scenes, but this year we’re fully devoted to working on LDC Via, so we’ll be attending as a sponsor for the first time.

So it’ll be a busy couple of days:

  • We’ve got the booth and would love for you to come by, say hello and see what we’ve been working on for the last year.
  • We’ll be doing a sponsor session at 2pm on Monday to talk in a bit more detail about LDC Via.
  • I’ll be presenting a session titled “node.js for Domino developers” at 4pm on Monday.

All in all, it promises to be a great couple of days. If you haven’t already, take a look at the agenda and register before the event fills up.

MWLUG just a week away

In just a week I’ll be flying to Atlanta to attend my first ever MWLUG event.

From what I’ve heard it’s a really good conference and I’m certainly going to be busy. I’ll be working on the LDC Via booth in the sponsor showcase most of the time so please come and say hello.

And then at 4:45pm on Thursday I’ll be presenting a new session:

Node.js for Domino Developers

The world of web development is moving pretty fast, so it’s good to keep an eye on what the leaders are doing. Node.js is currently one of the most popular development platforms out there. This session will walk you through the basics. What is node.js? How do I get started? Is it something I want to get involved with? The key for all of this is that Matt is a Domino developer by trade, so we’ll talk in terms we all understand!

Hope to see you there.

End of year review 2014

This year has been one of disruption on the home front, but in a positive way. We decided to replace the main bathroom in the first half of the year but that was really just a taster for the main project of building an extension onto the back of the house. Where we had a small kitchen and a dining room that we didn’t really use very much, now we have a large kitchen diner family room that we spend the majority of our time in. It was the first time I’d tried to do a project of this scale and for the most part it was remarkably smooth. Though we were so glad when, in November after three months of work, we finally got rid of the building team. But having just done our first Christmas in the new room, we really couldn’t be happier with the changes. 

The new room after 4 months of work

It did mean that there wasn’t as much time for other things as I would have liked. I realised a couple of weeks ago that I only took two weeks off work for the year; this is not enough, and it will be resolved in 2015!

That being said, I worked far more from home than previously so I got to spend much more time with the family. I’m really incredibly lucky to be able to wander downstairs from work at 5:30 and spend a couple of hours with the toddler before she goes to bed of an evening. Unbelievably she is already 2.5 years old. 

On the work front 2014 has seen significant change. Domino work has become much more limited in its scope and volume. But it’s given us the opportunity to do something that we’ve been wittering about for a long time. Earlier in the year we (that is Ben, Mark, Julian and myself) started coding a new tool that has become LDC Via. This involved learning new technology stacks in the form of node.js and MongoDB. It’s been a long time since I learned so much in such a short period of time but we’ve finally reached the point where we have a product that we’re happy to share with our beta program members. You’ll hear a lot more about our offering over the next few months. 

I still do Domino work when the opportunity arises but I think at this point I’d consider myself a node.js / MongoDB developer. It’s something that was always going to happen, it just happened a little sooner than I had anticipated. 

So 2015 is going to be an exciting one: we have some big changes coming on the home front in a couple of months and at the same time I need to be rolling out a new product and continuing to develop it.



Haven’t we been busy

Over the last few months you may have seen less of me and the rest of the LDC team than usual. But there’s a good reason: we have been busy little bees creating a new product that we hope you’ll be interested in if you work in the Domino world.

Over the next few weeks we’re going to cover the Why, What and How of LDC Via, and Why is first.

What we’re seeing now is that Notes and Domino are viewed (whether correctly or not—that’s a separate discussion!) as being somewhat old-fangled, uncool and—that horrible word—“legacy”. The upshot is that some organisations are reducing, or even stopping, investment in this platform. For email they may well be migrating, or have migrated, to Microsoft or Google. Those that are staying with IBM may be considering moving to IBM SmartCloud for email. So, for many, the on-premises Domino servers have become a liability, a problem, and an unwanted overhead. But—and it’s a big but—there are important business records and data locked away on those servers.

Head on over to our blog to read more…

We’re also looking for people to get involved with our beta program. If you or your company is interested then please contact us and we can let you know more detail.

Call for abstracts for ICONUK

Are you registered for ICONUK 2014 yet? Tim Clark is running the event this year, but we still need speakers. It’s that time of year again when we ask you to throw your hat into the ring if you want to speak at ICONUK. I’m running the Dev track, and I just thought it would be useful to run through the process of how we choose the sessions.

Firstly I move all of the session titles and abstracts into a spreadsheet and remove any names from the list to try and avoid falling into the trap of automatically selecting “the old dependables”. Just to give an idea, last year we had nearly three times as many abstracts as we had sessions, so this next bit of the process is very hard. We go through the abstracts and we rate each one so that there is a league table. 

Next we add in the speaker names. We try our best to get as many speakers in as possible, so if someone has played a blinder and submitted multiple abstracts which we like, we’ll either pick the best one and remove the others, or contact them to see if maybe they could speak with someone else, ideally someone new to the speaking circuit.

After much arguing and horse trading with the other tracks (sometimes we can stretch the definition of Infrastructure or Dev or Management!) we end up with a list and we’ll send out the notification emails.

And that’s where the real fun starts for you!

So please do have a think about whether you want to try out presenting in a smaller, less terrifying setting than Florida. The Call For Abstracts closes on Monday 30th June and we’ll aim to let people know as soon after that as possible.

CSC Event 2

We’re very pleased to announce our 2nd CSC Event, taking place this time in the evening of July 16th. It will still be at the fantastic Soho Hotel in London, which everyone found such a success last time, but this time we’re trying something a little different.

The event begins at 6pm and will run until 9pm with only two sessions followed by drinks and possibly further drinks afterwards.

Last time, our biggest piece of feedback was that it was a shame to have to choose between sessions, and that it would be good to see both Technical and Work & Life, so we will be running the two sessions one after the other: a Technical session in the Screening Room, then we’ll all move to the Crimson Bar for a Work & Life session, and since we’re in the bar, we’ll finish up with some drinks.

As always we hope for audience participation, and we’d ask you to register and invite friends and colleagues – this event reaches far outside the IBM world and our discussions are very broad.

Registration is here

Technical Session: This will consist of three 10-minute presentations followed by a 30-minute open floor discussion
“Exciting Technologies”
Where is the next surge in technical innovation coming from and how do we position ourselves for that?  Wearables?  Cloud Storage?  Collaborative Working? Searching and Finding Content?  Every day we discover or hear about new technologies and technology areas that sound promising.  Which ones are exciting and how do you decide between standing back, monitoring, beta testing or positioning yourself for adoption?

Work and Life Session:  This will be an open floor moderated discussion. All voices welcome!
“When People Are The Problem”
People collaborating together can do amazing things, but there is always the potential for personality clashes, misunderstandings and friction, particularly when change and issues of ownership are involved. In this session we will discuss how these issues can arise and what techniques we all use to finesse them or whether, when reality crashes in, collaborative working can work at all.

Oh yes, we’ll need a database

I suppose this is the biggie. Since 1996 my database of choice has been the venerable NSF, a document-oriented database. Coincidentally, the rest of the world has now caught up and there are more modern options available to achieve the same things. The real flavour of the moment is MongoDB: like the NSF, it’s document-based and can be replicated between multiple locations. Unlike the NSF, it is much more scalable, and the old limitations that we were constrained by have largely been mitigated. Of course there is no such thing as a silver bullet, but really there is no comparison between the NSF and modern document-oriented databases. You have no idea how sad that makes me, but it is true.

Anyway, back to the real world. Every application needs a database, and my choice in this case is MongoDB. As ever, this is not a tutorial; the MongoDB website provides quite reasonable documentation to get you going:

Now, you could just go with Mongo directly, but in various tutorials I had read about MongooseJS as well. MongoDB is just a database; it doesn’t force you to implement schemas, validation, or really any structure on your data. It may well be the case that you don’t want any of that, in which case go for your life: the syntax for talking to MongoDB through its native driver is very simple. But with the addition of MongooseJS we get another layer of abstraction and assistance.
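To give a flavour of going with Mongo directly, here is a minimal sketch using the native node.js driver of the era; the connection URI, collection name and document contents are all placeholders:

```javascript
// Sketch using the MongoDB native node.js driver (2014-era callback API).
// The connection URI and collection name are placeholders.
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost/demo', function (err, db) {
  if (err) throw err;
  // No schema required: insert whatever document shape you like
  db.collection('people').insert({ name: 'Matt', role: 'developer' }, function (err) {
    if (err) throw err;
    db.collection('people').findOne({ name: 'Matt' }, function (err, doc) {
      if (err) throw err;
      console.log(doc.role); // documents come back as plain JavaScript objects
      db.close();
    });
  });
});
```

Note that this needs a running MongoDB server and the mongodb npm package installed before it will do anything.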

The primary boon for me has been the creation of simple schemas. Each of the document types in my application has a schema defined: I know which fields will be created, I can set default values for optional fields, and I can define server-side validation rules in a nice structured way. It’s also really useful to have a standard way of running code every time a document is saved back to the database.
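As a sketch of what that looks like (the document type and field names here are invented for illustration), a MongooseJS schema with a default value, a validation rule and a pre-save hook might be:

```javascript
// Sketch of a MongooseJS schema; the model and field names are invented.
var mongoose = require('mongoose');

var personSchema = new mongoose.Schema({
  // Validation: the name field must always be present
  name: { type: String, required: true },
  // A default value for an optional field
  role: { type: String, default: 'developer' },
  // A server-side validation rule, defined in a structured way
  email: {
    type: String,
    validate: [function (v) { return /@/.test(v); }, 'Not a valid email address']
  },
  updated: Date
});

// Code that runs every time a document is saved back to the database
personSchema.pre('save', function (next) {
  this.updated = new Date();
  next();
});

module.exports = mongoose.model('Person', personSchema);
```

This needs the mongoose npm package; once the model exists, every save goes through the same validation and the same pre-save code, which is exactly the structure the NSF never gave us.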

And at this point we have our first venture into the cloud. It’s very easy to get everything we’re talking about running on your local machine, but what about when you want to show it off to the rest of the world? In my case I’m working with Bruce so for him to test we need an environment. There are hundreds of options available to you, but as with all of these posts I’ll describe what I’m doing.

Heroku Console

First we have the node.js hosting, where we’re using Heroku. It’s nice and simple to create a development instance that runs on a single dyno, and then to add plugins for things like console management, monitoring and so on. But we also need somewhere to host the database itself; for this we’re currently using MongoLab, which again makes it very easy to set up a development database instance. The key thing about both of these development environments is that they are free. So I can do a Git push from my repo, add configuration to the application to point to MongoLab for data storage, and suddenly my application is running in the cloud.
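The configuration step can be as small as reading the connection string from an environment variable. A sketch, assuming the MONGOLAB_URI variable that the MongoLab Heroku add-on conventionally sets (the file name and fallback URI are just illustrative):

```javascript
// config.js: choose the MongoDB connection string from the environment.
// MONGOLAB_URI is the variable the MongoLab Heroku add-on conventionally
// sets; the local URI is just a development fallback.
var mongoUri = process.env.MONGOLAB_URI || 'mongodb://localhost/myapp';

module.exports = { mongoUri: mongoUri };
```

The same code then runs unchanged on your laptop and on Heroku; only the environment differs.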

MongoLab console

This is all worryingly simple isn’t it? 

Jade HTML Template Engine

In the last post I mentioned that when you start using Express, you are nudged towards Jade to create your HTML.

At its simplest, Jade provides a way to make your HTML more terse: rather than typing every opening and closing tag by hand, you write one element per line and let the indentation express the nesting.
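For example (a representative snippet; the title, class and text here are invented), to produce HTML like this:

```html
<!DOCTYPE html>
<html>
  <head>
    <title>My Site</title>
  </head>
  <body>
    <h1 class="headline">Hello, world</h1>
    <p>Welcome to the site.</p>
  </body>
</html>
```

we would only need to enter Jade like this:

```
doctype html
html
  head
    title My Site
  body
    h1.headline Hello, world
    p Welcome to the site.
```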

We can also break up the contents of our pages, so that, for example, the head contents are defined in a separate file and shared across multiple pages. The syntax is very simple: indentation is used to nest elements within each other, and you can add ID, class and other attributes with simple markup.

But really the power is that you can also do simple scripting inside the Jade file. So you can conditionally load chunks of the page based on data that you pass in, and you can pass variables into the Jade template to be used when building the HTML; the page title, for example, can be passed in from the route configuration.
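A sketch of both ideas together (the title and user variables are invented for illustration; they would be passed in when the route renders the page):

```
//- "title" and "user" are variables passed in from the route
html
  head
    title= title
  body
    if user
      h1 Welcome back, #{user.name}
    else
      a(href='/login') Please log in
```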

Of course, you can bind fields from your database into the page using similar techniques. 

The structure of my real world application has a layout Jade file which itself is made up of several Jade files: head, header, footer, and foot. In the head I have all of my CSS and web attributes. The header is the navigation header for the site. The footer is a static footer navigation bar and the foot contains all of the JavaScript files that need to be downloaded to the client. In between header and footer I can insert my content.
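As a sketch of that structure (the partial file names follow the description above; the block name is an assumption), the layout might look like:

```
//- layout.jade: the overall page, assembled from the shared partials
doctype html
html
  include head
  body
    include header
    block content
    include footer
    include foot
```

and each individual page then extends the layout and fills in the content block:

```
//- index.jade: one page's content, inserted between header and footer
extends layout

block content
  h1 Hello from the home page
```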

The momentum in the web development world is very much towards AngularJS at the moment, which doesn’t really sit too well with Jade. It is possible to use both, but they tread on each other’s toes rather a lot, so it may end up being more trouble than it’s worth. For my purposes, though, Jade and jQuery are working well together at the moment.