Monday 24 October 2022

Raspberry Pi 400 Bullseye 64-bit - PackageKit Error Timeout was Reached [fixed]

I have upgraded Raspbian to Debian Bullseye 64-bit on my Raspberry Pi 400... and not all of the usual software was available.  (It's possible this only happens if you have added an extra repo to Buster or Stretch before upgrading.)

Running 'Recommended Software' under Preferences, the Recommended Software window pops up and attempts to load the software list, but it ends with a message like "Error calling StartServiceByName for org.freedesktop.PackageKit - timed out."

Running sudo apt update (or upgrade) I see a single "Error Timeout was Reached" almost every time.

This fixed it (finally):

sudo rm /var/lib/PackageKit/transactions.db   # or remove all of the *.db files in that folder
sudo systemctl restart packagekit.service
sudo apt install packagekit-tools   # only needed if pkcon is not already installed
sudo pkcon repair

[Source: a Red Hat bug report - but it's a bug in PackageKit, which is invoked by apt:

https://bugzilla.redhat.com/show_bug.cgi?id=1188455#c3 ]

I am no longer seeing the Error Timeout on every apt update.. and Recommended Software has finished loading, so I should be able to install some more software - at last...

Wednesday 23 June 2021

2021 June Update

Nearly 5 years have passed without any comments... so what have I been up to?  Just a few things.. 

  1. From 2009 to 2016 I was taking a BSc in Health Sciences and Biology with a focus on Molecular and Cell biology and neuroscience.  I graduated in 2016..  That's a second degree...

  2. From 2017 to 2018 I took an MSc in Molecular and Cellular Biology at Sheffield Hallam University in Sheffield... I became more interested in cancers and genomics, and generally the underlying molecular pathways in human diseases.

  3. From 2018 to 2021 I have been a PhD student at Sheffield Hallam University - with the PhD title "Identification of variants contributing to the risk of schizophrenia using high-throughput sequencing".  This is a PhD based in dry-lab/computational biology.  I presented some preliminary findings in a poster last year at a (virtual) international conference, FENS 2020.  And now I am writing up...  I have completed almost all of the big data work, using RStudio and the Linux command line.  The server is a beast - 500GB of RAM, 2 Xeon 32-core processors... 

  4. Last week I took part in Stepik's Bioinformatics Contest 2021 and have qualified for the next round... 5700 people signed up and ~1400 qualified.  It's a tough programming contest, but there's a good community of data scientists, bioinformaticians, and software engineers who get involved.  I finished the qualification round at position 131, two below the chunk of those at joint 1st position.  I put some code up on github for one of the questions (zorgster/bioc21q3).  I completed all of the questions in Python (but started out thinking I was going to use Java or C).

  5. Over the last few years I have completed a few Coursera courses:
    • Johns Hopkins University - Data Science specialization.  This was a tough course to do alongside a PhD - I progressed slowly - but perseverance paid off in the end.
    • DeepLearning.ai's Deep Learning specialization covering Neural Networks, CNNs, and RNNs (LSTM+GRU etc.) - this led to an interest in colouring old photos (colouring by context) and neural style transfer (blending the style of one image into the content of another).  I thoroughly recommend this as an introduction to NNs starting with the fundamentals.  Andrew Ng is great at explaining things and building it up lesson by lesson.
    • Fundamental Neuroscience for Neuroimaging - I had to read a lot about MRI studies in schizophrenia and needed to know more about all of the terminology... 
    • On and off, I take a look at Algorithms, Part I - just to a) improve my knowledge of data structures and algorithms and b) keep my Java skills going... 

  6. For Christmas I was given a Raspberry Pi 400 - a Pi in a keyboard!  A throwback to the old ZX81 days... I haven't had the time to explore it as much as I would like - that will come once I get the thesis out of the way.  I now have a stack of Arduinos and 2x Pis to turn to when I have some spare time.

  7. 3-4 years back, I helped produce a few video tours for a paper about Japanese sea wall defences - it was finally published 2 years ago in 2019 - Imagining Disasters in the Era of Climate Change: Is Japan’s Seawall a New Maginot Line?.  The paper has been cited 4 times to date. I thank Prof. Peter Matanle at SEAS, University of Sheffield for including me as a co-author... It was also a harrowing introduction to the review process - although I was mostly an observer of the process.  Peter is talking about a follow-up paper in the future.

My laptop has taken a beating in the last year... constant writing and tapping away during lockdown.  The power cable socket wore out and I had to install a replacement.. luckily on a Dell Inspiron 3585 it simply unscrews and unplugs from the motherboard.

Problems to solve...  my old Asus laptop hard disk strangely won't boot.  The head seems to be stuck.  This happened coincidentally - after I used a pen drive onto which I had imaged an Ubuntu installation disk.  When I closed down the Ubuntu OS and tried opening Windows again, nothing.   Just a high-pitched buzzing from the disk.  A bit gung-ho, I opened the disk quickly and pulled the head back.  I had access briefly but then the head stuck again.  Can an Ubuntu pen drive do that to an HD?  Somehow mess up the firmware?    It's forming part of the queue of things to look at when I have the time.

It would be nice to get some of the data off that disk... 

Some sites used for coding... 
https://deepnote.com/   - Jupyter notebook-style Machine Learning workspace.
https://kaggle.com/   - Machine Learning competitions related to actual research

And Family History - I've done a lot of that...  23andMe or AncestryDNA combined with family history tree-building with Ancestry is very effective...   but also very distracting, so it's on the sidelines too... 

Writing up thesis.... everything must be put on hold - unless it is directly related to the PhD.. 


Sunday 7 February 2016

Django on a Raspberry Pi 2 with Python 3.4

I've been following an old (3 years counts as old) post by Matt Woodward on setting up Django in a virtualenv on a Raspberry Pi...  but I keep seeing that Django now runs on Python 3... and I prefer Python 3 on the Raspberry Pi, where possible.  So this is a follow-up tracing the steps I have taken to install Django in a Virtual Environment with Python 3 on the new Raspberry Pi 2.

After first trying it Matt's way, I wiped the RaPi2 and started over - easy to say with a Pi2 since it's much quicker than an old Pi...

The first step after reboot - and connecting to my WiFi - is to update apt-get and upgrade the pre-installed packages, then do a firmware update, which requires a reboot.
sudo apt-get update
sudo apt-get upgrade
sudo rpi-update
Somewhere in the installation of Django with Python 2, I discovered I need python-dev... I tried apt-get install python3-dev and python3-setuptools... but they are already installed in the basic RaPi2 distro.  So is Nano.

Whilst looking up how to use virtualenv and virtualenvwrapper to create a Python3 environment, I discovered there's a native Python3 module that probably should be used instead, pyvenv.
sudo apt-get install python3-venv
Now I have two more commands:

  • pyvenv (uses the default - most recent - Python 3 for the virtual environment)
  • pyvenv-3.4 (specifies Python 3.4)
Presuming you are in the folder where you would like to create virtual environments... we can now create a new Python 3.4 Virtual Env:
pyvenv-3.4 djenv
Unlike with virtualenvwrapper, I can't use 'workon djenv' to activate it... we need to use 'source'
source djenv/bin/activate
Now I can see (djenv) added in front of the command prompt to indicate that I am working with the virtual environment, 'djenv'

pip is installed by default... so we can go right ahead and pip Django...
pip install Django==1.9.2
(The version number is taken from the Django Project download page - it might be different by the time you read this; it's now Feb 2016.)
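
A quick way to confirm Django has gone into the right environment (a small check of my own, not part of Matt's steps - run python3 inside the activated venv and try):

import django                  # fails with ImportError if Django isn't installed in this environment
print(django.get_version())    # should print 1.9.2 (or whichever version you installed)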

Some useful commands:
pip list = shows you which Python packages are installed and their versions.  Django installed outside of the environment is not available inside it, so I will run the above command inside the pyvenv.
pip show Django = shows you information about a package - here, Django.
Further reading on pip:
https://docs.python.org/3/installing/index.html#installing-index

Then we're ready to create a new Django project:
django-admin startproject mysite
See the Django website for an excellent tutorial and documentation...

https://docs.djangoproject.com/en/1.9/

PS.  where the command calls for 'python'... I have been using 'python3' on the Raspberry Pi 2... so there is no confusion.. it might not be necessary - e.g.
python3 manage.py migrate
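
To prove the project actually serves a page, here's a minimal smoke test - just a sketch of my own (the file names and the 'Hello' text are my choices, not from the official tutorial), assuming Django 1.9's url() syntax:

# mysite/mysite/views.py - a new file (hypothetical smoke test)
from django.http import HttpResponse

def index(request):
    return HttpResponse("Hello from Django on the Pi 2")


# mysite/mysite/urls.py - add the view to the generated urlpatterns
from django.conf.urls import url
from django.contrib import admin
from . import views

urlpatterns = [
    url(r'^admin/', admin.site.urls),
    url(r'^$', views.index),
]

Then python3 manage.py runserver 0.0.0.0:8000 should serve the page to other machines on the LAN (0.0.0.0 tells the dev server to listen on all interfaces - you may also need to add the Pi's address to ALLOWED_HOSTS in settings.py).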

Thursday 3 December 2015

Exp function in Windows Calculator

I posted this as a response to a question on answers.microsoft.com in 2013... and will re-post it here with additional screenshots.  (I had a notification there was a response on it today, and saw that my answer was marked as helpful 22 times).

The question was... how to use 'Exp' functions in Windows Calculator in calculations such as:

3e^(-0.5)

1.  Start -> Run -> Calc.exe  (or Press Start button and start typing Calc... )
2.  Switch to Scientific view.  View menu -> Scientific  (or press ALT-2)


3.  Press '3' and '*'  and 0.5 and '+/-'...
4.  Click the Inv button.


5.  Press the e^x button : 

Result:

NB:

The 'EXP' function is short for "* 10^x" and gives you for example:

7899 EXP 3 +/-  ->  7899.e-3   =  7.899
1 EXP 4 +/- ->  1.e-4  =  0.0001
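
To double-check the difference between the two keys, here's the same arithmetic in Python (just verifying the numbers above - nothing to do with Calculator itself):

import math

print(3 * math.exp(-0.5))   # the e^x route: 3 * e^(-0.5) = 1.8195919...
print(7899e-3)              # the EXP route: 7899 * 10^-3 = 7.899
print(1e-4)                 # the EXP route: 1 * 10^-4 = 0.0001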

Keyboard Shortcuts in Windows Calculator

Here's a list of shortcuts for most Windows native apps.

Calculator keyboard shortcuts

Friday 24 July 2015

KRB_AP_ERR_MODIFIED - Event 4 - SBS11

Today I'm seeing several Security-Kerberos event id 4 messages on an SBS 2011 stand-alone server:

The Kerberos client received a KRB_AP_ERR_MODIFIED error from the server COMPUTER1$. The target name used was RPCSS/COMPUTER2.mydomain.local. This indicates that the target server failed to decrypt the ticket provided by the client. This can occur when the target server principal name (SPN) is registered on an account other than the account the target service is using. Please ensure that the target SPN is registered on, and only registered on, the account used by the server. This error can also happen when the target service is using a different password for the target service account than what the Kerberos Key Distribution Center (KDC) has for the target service account. Please ensure that the service on the server and the KDC are both updated to use the current password. If the server name is not fully qualified, and the target domain (MYDOMAIN.LOCAL) is different from the client domain (MYDOMAIN.LOCAL), check if there are identically named server accounts in these two domains, or use the fully-qualified name to identify the server.

It would appear that the IP address for COMPUTER1$ in the DNS is actually being used by COMPUTER2...  so try to find the IP address for COMPUTER2...

Open Administrator Tools -> DNS
Navigate to DNS->servername->Forward Lookup Zones->mydomain.local
Order by the Data column... which may contain mostly IP addresses
Look down the Data column for duplicated IP addresses.  In my case COMPUTER1 and COMPUTER2 had the IP address 192.168.0.15

Run a CMD window (Windows Key+R -> type 'cmd' -> OK) and type: ping -a 192.168.0.15 (or whatever the duplicate IP address is).  You can also run nbtstat -A 192.168.0.15... this resolved to COMPUTER2...
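
If you have a list of suspect addresses to check, the reverse lookups can be scripted - a rough sketch (it only asks DNS for the PTR record, so it tells you what the reverse zone says rather than what nbtstat reports):

import socket

# 192.168.0.15 is the duplicate address from this example - swap in your own suspects
for ip in ["192.168.0.15"]:
    try:
        name, aliases, addresses = socket.gethostbyaddr(ip)
        print(ip, "->", name)
    except socket.herror:
        print(ip, "-> no reverse DNS entry")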

In the DNS Forward Lookup Zones for mydomain.local, I deleted anything with an IP address of 192.168.0.15 that was not COMPUTER2 - one of them was COMPUTER1.

That should prevent this error message appearing again.

For each of these computers I was also seeing a DistributedCOM Event Id 10009

DCOM was unable to communicate with the computer COMPUTER1.mydomain.local using any of the configured protocols.

These had appeared every 30 minutes since the last reboot.  But not in the last 90 minutes since I deleted the duplicate forward lookup entries... Problem solved?   Now to figure out why the server locked us all out earlier today... WSUS overgrowth on the C drive is the first contender... the SUSDB is 22 GB....

Wednesday 25 March 2015

Separate out a service from SVCHOST.exe (and Superfetch)

I have a runaway svchost.exe (service host) that is using 135 MB of memory.  I want to know what is using up the memory and whether I can separate it out into its own service host, so I can watch it by itself.

Before Windows Vista and the later improvements in Task Manager, you would have to use a cmd window and tasklist.exe to find which services were being hosted by which service host.  Tasklist produces a list of all running processes:

Image Name               PID Session Name  Session#    Mem Usage
=================== ======== ============ ========= ============
System Idle Process        0 Services             0         24 K
System                     4 Services             0      1,612 K
smss.exe                 408 Services             0         88 K
csrss.exe                632 Services             0      1,944 K
csrss.exe                684 Console              1     69,172 K
wininit.exe              692 Services             0        112 K
winlogon.exe             756 Console              1      1,268 K
services.exe             796 Services             0      5,764 K
lsass.exe                804 Services             0      7,236 K
lsm.exe                  812 Services             0      1,624 K
svchost.exe              908 Services             0      4,416 K

Tasklist /svc shows which services each process is hosting:

Image Name              PID Services
===================== ===== ==========================================
System Idle Process       0 N/A
System                    4 N/A
smss.exe                408 N/A
csrss.exe               632 N/A
csrss.exe               684 N/A
wininit.exe             692 N/A
winlogon.exe            756 N/A
services.exe            796 N/A
lsass.exe               804 EFS, KeyIso, ProtectedStorage, SamSs
lsm.exe                 812 N/A
svchost.exe             908 DcomLaunch, PlugPlay, Power
svchost.exe             988 RpcEptMapper, RpcSs
svchost.exe             600 AudioSrv, Dhcp, eventlog,
                            HomeGroupProvider, lmhosts, wscsvc
svchost.exe             572 AudioEndpointBuilder, hidserv, Netman,
                            PcaSvc, SysMain, TrkWks, UxSms, Wlansvc,
                            wudfsvc
svchost.exe            1060 EventSystem, fdPHost, FontCache, netprofm,
                            nsi, SstpSvc, WdiServiceHost
svchost.exe            1120 AeLookupSvc, Appinfo, BITS, Browser,
                            EapHost, IKEEXT, iphlpsvc, LanmanServer,
                            ProfSvc, RasMan, Schedule, SENS,
                            ShellHWDetection, Themes, Winmgmt, wuauserv
svchost.exe            1196 gpsvc
TrustedInstaller.exe   1228 TrustedInstaller

PID 572 svchost.exe is using quite a lot of memory.  If you find this svchost.exe in Task Manager, right-click and choose Go To Service(s); it will jump to the Services tab and highlight each of the services named on the right-hand side.  [This is a great improvement over the poor TaskMan console in Windows XP, which could easily have been updated in a service release during the years that Windows XP was being sold... though there were ways to replace TaskMan with improved versions manually.]
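
If you'd rather not click around Task Manager, the same lookup can be scripted against tasklist's /fi filter - a quick sketch (PID 572 is just the example from the table above):

import subprocess

# ask tasklist which services live inside the svchost.exe instance with PID 572
result = subprocess.run(
    ["tasklist", "/svc", "/fi", "PID eq 572"],
    capture_output=True, text=True
)
print(result.stdout)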

When I restart the service SysMain - Superfetch, the working memory used by this svchost process drops significantly.  So I want to run SysMain in its own process to watch it independently of the other processes.   To do this apparently I need to run:

sc config servicename type= own   (there has to be a space after 'type=')... so:

sc config SysMain type= own ... and running this gives me.. er.. failed.

[sc] OpenService failed 5

So if you get that, you don't have the elevated permission required to change the service.  Close the cmd window and run cmd as Administrator instead.  (Press the Start button and type 'cmd' then right click the cmd.exe when it appears, choose Run as Administrator from the context menu.)

Now it says:

[SC] ChangeServiceConfig SUCCESS

If you find the service in Regedit - HKLM\SYSTEM\CurrentControlSet\services\SysMain - you'll see that the Type REG_DWORD changes from 32 (0x20) to 16 (0x10) - shared and own respectively.
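
You can also read that value back programmatically - a small sketch using Python's winreg module (it only reads the value, it doesn't change anything):

import winreg

# read the service Type value for SysMain (Superfetch)
key = winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"SYSTEM\CurrentControlSet\services\SysMain"
)
service_type, _ = winreg.QueryValueEx(key, "Type")
winreg.CloseKey(key)

print("Type = 0x%X" % service_type)   # 0x20 = shares a process, 0x10 = runs in its own process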

Restart the service.  Tasklist should now show the service has been placed in its own process - 22832.

Superfetch: this was introduced in Windows Vista.  It basically attempts to anticipate which programs you use most frequently and then swaps them in and out of the memory cache.  So what if you use the computer like me... have 3 browser windows with a total of 25 tabs, two Visual Studio instances, Word, Regedit, Explorer, TaskMan, Excel all open at the same time?  I think Superfetch is running to its max and killing my system.  Now it's like working in treacle.  Could it be that what was designed as a 'performance enhancement' is actually unable to keep up under heavy usage?

Windows 7 has a program called Resource Monitor (resmon.exe), which sits alongside the old Performance Monitor (perfmon.exe).  This is a useful program - you can select every instance of Chrome, for example, and see what all the windows are doing... I've got one process, 12544, which is opening Cookies, my $LogFile, my user data cache for Chrome... and is connecting to the following:

lhr08s06-in-f9.1e100.net
sea15s02-in-f3.1e100.net
we-in-f189.1e100.net
collector.trendmd.com
lhr08s06-in-f14.1e100.net

[1e100 = 10 to the power of 100... which is a 'Googol'...]

I'm going to explore Chrome's own TaskMan to see what's running on that PID.. seems it is the main base browser instance and not any of the tabs.  Google keeping tabs on my usage...

I'm going to try living without Superfetch running for a bit... to see whether switching it off makes my laptop run any differently..




Friday 17 October 2014

Ebolaviruses - Ebola

This is going slightly off-topic... but still it's a process of thinking...

Now I appear to be switching from talking tech to talking virology... what are my credentials for doing this?   Well for the last 5 years in my spare time I have been studying for a second degree in Molecular Biology with the Open University.  One of these courses was Infectious Diseases and Public Health.  So I currently hold a Diploma in Health Sciences and am one module away from the degree - which I had to defer to next year due to getting an IT contract with UK Trade & Investment in London.

Ebolaviruses... so these are not the same as your normal Orthomyxoviruses or retroviruses... as they don't have a nice viral envelope... there are glycoproteins on the outer membrane wall...

Orthomyxovirus - e.g. influenza

Retrovirus - e.g. HIV

Ebolavirus - e.g. Ebola



A related virus to Ebola, discovered earlier, is the Marburg virus...  here's an image from Wikipedia - an impression of the central viral nucleoprotein - i.e. how the RNA is coiled tightly in the centre surrounded by packing proteins...

These proteins are produced from the RNA, once inside a cell, by the polymerase molecule at the front end...  most of the infecting proteins are encoded at the start of the RNA strand...  and the polymerase can also restart... so you get a lot of the first proteins to begin with... and then, as the polymerase transcribes more of the downstream proteins, the balance tips and the Ebola virus switches from infecting the cell to producing new virions, which then escape from the cell.

Entry

My first query was how does the polymerase start transcription?  It sits dormant in the virion until it enters the cell?   Another interesting topic I read up on was Apoptosis - the process of cell death.  Specifically Professor Guy Brown's "Regulation of apoptosis by the redox state of cytochrome c"
http://www.sciencedirect.com/science/article/pii/S0005272808000807

Cytochrome c exits the mitochondria and activates the apoptotic pathway that leads to cell death.  But Professor Brown discusses here the observation that, on entering the cytosol, the cytochrome c molecule is automatically reduced and inactivated.  So there must be something keeping it in an oxidised and active state before it forms the apoptosome with the caspases...

So I wondered whether, on entering the cell... perhaps the polymerase is activated by reduction... an alteration of pH... a quick search on scholar.google.com... and a paper from 2005 suggests (per its title) - "Endosomal proteolysis of the Ebola virus glycoprotein is necessary for infection" by Kartik Chandran et al.  They identified that Cathepsin B (CatB, an endosomal cysteine protease) plays an important role (and Cathepsin L, CatL, a supporting role) in the breakdown of the viral matrix membrane - by removing GP1 from the GP glycoproteins.  They also suggest that CatB and CatL inhibitors can slow the multiplication of the virus...  They operate in the slightly acidic endosome...  so how would the polymerase be released, with a trailing RNA, into the cytosol?

There has been a lot of science since 2000 done on Ebola viruses... One can learn a lot with a few simple searches...

Ebola and Marburg enter cells using the Macropinocytosis and Endocytosis pathways.  These are standard pathways either initiated from within the cell to promote membrane homeostasis.. or externally by viruses or signalling molecules etc..  one of these is the cholesterol pathway - triggered by LDL attaching to receptors on the cell surface...

One paper suggests that chlorpromazine, an anti-psychotic, prevents the recycling of clathrin - the protein involved in endocytosis (it pulls the membrane inwards in the endocytosis process) - back to the membrane and so it inhibits endocytosis... and one team showed that clathrin inhibition by chlorpromazine can inhibit further infection (sucrose also has this effect)...

So I was thinking about some basic cell biology .. and LDL and that got me thinking... could Ebola be using the cholesterol pathways ?  Since cells are constantly endocytosing LDL and recycling clathrin + LDL receptors back to the cell membrane...

One of the first (2002) papers is titled "Lipid Raft Microdomains: A Gateway for Compartmentalized Trafficking of Ebola and Marburg Viruses"... this team from Maryland, Sina Bavari et al, identified cell membrane lipid rafts (mostly cholesterol) as the gateway of viral attack - and exit.  Ebola has the ganglioside GM1, a marker for lipid rafts, in its viral envelope.   Can't lipid rafts be disrupted?  By statins?

Ok ... a quick search for "ebola statins" ... this has been thought of... NY Times in August.. good I'm catching up... :-/
http://www.nytimes.com/2014/08/16/opinion/can-statins-help-treat-ebola.html?_r=0

So .. yeah.. rather than giving statins pointlessly to a load of people in the West who really don't need it... why not donate it all to Africa... ?  if it's not going to cause a problem giving it to healthy people in the UK and US as a prophylactic treatment?  then give it to healthy people in Africa instead?

I'm not sure that's going to stop Ebola.. it's not a cure.. It might slow it down a little... but you need to stop it.  Mostly that can be done with quarantine/isolation of the infected and those suspected of infection, harsh as that can be on a society.  And also disinfecting with solution of bleach and PPE (protective clothing)... but there's not enough for everyone.

If interested to read about the measures required to protect oneself whilst dealing with Ebola..

Infection prevention and control guidance for care of patients in health-care settings, with focus on Ebola
http://www.who.int/csr/resources/publications/ebola/filovirus_infection_control/en/

Useful to know if you work with infected people..

More information on Ebola:
WHO - http://www.who.int/csr/resources/publications/ebola/en/
ViralZone - http://viralzone.expasy.org/viralzone/all_by_species/207.html

And all because someone somewhere..... "Bats for dinner, dear?"

Don't eat bats.  They're bad for you... and the rest of your country, world...

Amendment...

Getting carried away... if Ebola requires entry via a cholesterol pathway... LDL receptors or lipid rafts...  what if you have hypercholesterolaemia?  a reduction in LDL receptors etc..  could you be immune?  My guess is that there are many different types of hyperlipidemia caused by a variety of genetic mutations.

Perhaps if you could identify people with mutated receptors .. you could find ideal candidates to help treat the infected... :-)



Thursday 21 March 2013

QuikTip: Window Shake - Win7

Click and hold the top bar of a Window (if it's maximized, drag the Window down a bit so it auto-restores), and then give the mouse a slow shake side-to-side...   All the Windows around it will minimize.   Another slow shake side-to-side and they all come back.

(If the Window is a modal dialog then all but its parent Window will minimize.)

I found a Windows tool recently (via a Twitter post) called AltDrag:  With this you can press ALT and click anywhere within a Window to drag it around... using CTRL as well makes the Window active..  Change options so that the Window you drag snaps to borders and taskbar (works like Aero in XP)...  AutoRemaximize - a window dragged from one screen to another will re-maximize after a second...

Unfortunately it doesn't implement the shake I mention above.  Perhaps in the next release.

Tuesday 26 February 2013

MySQL Pager Options

I've had a long hiatus from Windows Server work... I wasn't working much whilst moving away from the Peak District to York... and now I am diving back in with a digital asset management application, ResourceSpace, running on a RackSpace Cloud Server (and on a local BitNami ResourceSpace VM - PHP and MySQL are not my usual bunch, but they are proving not too hard to dig into and debug).

The MySQL console app is very powerful... and probably one should only go into a small part of it in detail, rather than attempt to explain everything that can be done with it... I've been using it in several ways over the last few weeks... most of this will be familiar to seasoned Unix users...

This one comes in handy when I want to output a whole load of detail into a tab-delimited file, which I can then grab via ftp and stick into Excel (unfortunately not directly - I needed to open the text file in Notepad++, select all, copy, then paste into Excel - that little trick converts the Unix tabs into Excel cells correctly) - from the command prompt (not in MySQL):
echo "select field1, field2, field3 from table1 where somecriteria = true" | mysql resourcespace -uroot -p > results.txt
If I just want to log what I am seeing to inspect it later, or just in case I want to keep track of what I have done in that session... then on the MySQL command line \T /home/user/os.txt will log output to the file, os.txt.  All output.  It won't stop logging until you enter \t by itself.

Anyway, today I have taken a break from using SSH to access the Cloud Server, where I can scroll back through the handy buffer.. and am using the terminal session on my local VM via VirtualBox... and there's no buffer... So here's my discovery for today (which I'll probably use later in SSH too)... (in mysql)

  \P cat > results.txt

Pager...  I just wanted to stop the results scrolling off screen, to page through them one page at a time so I can scan the records...  but I've discovered something altogether more powerful...  Pager relies on popen() - which creates a pipe, forks a process and invokes the shell to run the command - so it is very useful for passing results into sh scripts, which is where a lot of the power can be found...  The pager above outputs only the results to a file - which I prefer over tee.  

  \P more

Pager = More: Now I can press space to jump a page, CTRL-Z if I want to jump back to the Unix prompt (fg to return to the mysql session - this works whether you use more or not), CTRL-C quits the SELECT command but not the mysql session... Another pager:

  \P less

Pager = Less: Now you can use d = half a page down, u = half a page up, r = repaint the screen, q = quit, 100g = jump to line 100, g = jump to the first line, G = jump to the last line, '=' = tells you how big the results are and where you are in them (e.g. 85%), /pattern = search forward for a regex pattern, ?pattern = search back for a pattern, and so on.

  \P less -I -p "headline"

Pager = Less (-p highlight text) Now that's pretty cool... on all mysql output, the word 'headline' is highlighted (-I = case insensitive).. useful if I am trying to look out for fields that contain a particular word or phrase... 

  \P less -S

Pager = Less (-S = no wrap) stops wrapping of lines in the results, but if fields contain carriage returns then they do drop to the next line - you can use the left and right arrows to view columns that are off-screen... (or you can use \G instead of \g to run your SELECT statement ( SELECT * FROM table \G ) and view each row as a vertical list of fields).

A more complex version (from the MySql pages) of cat -> tee -> less can log to two different files at the same time and output to less:

  \P cat | tee /tmp/file1 | tee /home/user1/file2 | less -niSFX

  \P /tmp/grep_cmd

Pager = grepping (found on the SQL Performance Blog) this requires creating a file in /tmp which I am calling grep_cmd with the following text -:
#!/bin/sh
grep -A 1 -B 1 -i --color 'headline'
You could add a pipe into less (" | less ") after 'headline' or add the pipe to the \P command above to page the output... Then make this an executable using chmod a+x /tmp/grep_cmd - Now once the pager is set, mysql will only print out the record with a match, and a single record above and below it.  Matches will be highlighted or coloured.  You could just put the grep command in the pager.. but the script could be built on.


  \P cat > /dev/null

Pager = (hide output) This hides the record output from view.  You can see the number of rows returned and number of seconds to run the query.   Useful for comparing sets of queries, where you aren't interested in the output.


  \P vim -

Pager = vim  see Daily Vim - I don't use vim - but this might be of use to those who do.  I imagine it's useful to throw the results directly into a text editor...


  \P tr -d '`'

Pager = trim backquotes (one backquote surrounded by single quotes) - This was posted by Giuseppe Maxia here.  This can be used to strip characters out of your result set.  His example (above) strips the backquotes from a ( SHOW CREATE TABLE table1 \G ) statement.  You can put different characters between the single quotes and it'll strip those out too - note though that tr -d works character by character, so 'NULL' removes every N, U and L, not just the whole word.  It can also strip the vertical lines from a table result set ('|')...

Finally... for anyone who gets this far... log into mysql using the first line... and then call the second.  (I saw this tip on Parvesh Garg's blog Optim MySQL )


  mysql databasename --xml -uroot -p 
  \P cat | tee /home/user1/output.xml | less -niSFX

This outputs the results to an XML file and to the screen.  The XML file output is well-formed too.


Friday 3 February 2012

PDF Printing in Google Chrome: Bug? or should I worry?

I just received a PDF in my email ... and thought I'd view it and print it out.

First report ok.. my client's report looked fine.  Second report... er... The letters in their logo were scrambled and replaced with other letters.

The report was generated in MS Access... an Access Report that I originally created for my client.  It includes their logo as a BMP.  My client prints the report to a PDF, using something like CutePDF or pdf995.  He then emails the reports to my GApps.

I click on View next to the PDF.  And then Print.  I can then see the standard PDF print dialog.  So far the document looks OK to me.

However on the hard copy the text from the Logo has changed:


So I closed the PDF in the browser window, went back to my email, clicked on View again and re-printed... and all was fine.

Anybody want to hazard a guess what is happening?  Thoughts...
  • Is Google OCRing my [private] PDF, when it attempts to convert to a printable file?
  • The printer driver on my computer is doing it?  How does it know that I am printing letters in the PDF?
  • A virus?
  • If Google is putting the PDF through an OCR so that I can search it within Google Docs... then should I be worried about that?  
I think I will stick to using the Download option in future.  And print via Adobe Acrobat Viewer... Google Docs might be convenient, but I'm not sure I can trust the Chrome/Docs print service with confidential PDFs, until I know why this logo was not printed WYSIWYG...


Wednesday 1 February 2012

Crystal Reports: Sorting

I have a complicated Crystal Reports report - Management Report.   The report is based on a single table, which has been built up in Visual Studio, within my VB.NET Project.  The table pulls in data from about 7 different tables, with inner and outer joins... It contains all the fields that I need to calculate the sorting, as well as the display fields, and other fields I will use to calculate display fields.

For example,  ItemUidT1, ItemUidT2 are GUIDs from two tables and ItemRefT1, ItemRefT2 are strings from those two tables.  These two tables link to a main table by their respective ItemUids.


But I have one column in the report for printing ItemRef.  I want to be able to sort the whole table into all values from Table1 and then all values from Table2... but I also want the user to click a button on the form to change the sort order to an Index or Count value for all items irrespective of their Table.  Take it that the schema for Table1 and Table2 are very different, which is why they should not be stored in the same table with a 'flag' field.

This is not about building such a table... but using it in Crystal Reports.  The table could look like this:


fyi:
SELECT MainTable.MainTableUid,
               MainTable.ItemUidT1,
               Table1.ItemRefT1,
               MainTable.ItemUidT2,
               Table2.ItemRefT2,
               MainTable.OtherData
FROM Table2 RIGHT JOIN
             (Table1 RIGHT JOIN MainTable ON Table1.ItemUidT1 = MainTable.ItemUidT1)
                      ON Table2.ItemUidT2 = MainTable.ItemUidT2;
In the report I have created formula fields for displaying... so Formula: {@ItemRef} = [note: code is in VB Syntax not Crystal Syntax]
If Not IsNull({QueryMain.ItemUid1}) Then
    formula = {QueryMain.ItemRef1}
Else
    If Not IsNull({QueryMain.ItemUid2}) Then
        formula = {QueryMain.ItemRef2}
    Else
        formula = "n/a"
    End If
End If
My actual report has 8 sort fields.

SortFields(0):  Not linked to an Item Table

On one report I would like to move all the Items in the list which are not linked to an Item Table down to the bottom of the list.  To do this I create a new Formula called {@IsNotInItemTable1} =
If IsNull({QueryMain.ItemUid1}) And IsNull({QueryMain.ItemUid2}) Then
    formula = 2
Else
    formula = 1
End If
This is added to Report -> Record Sort Expert - sorted Ascending.

SortFields(1):  Alphabetical List of ItemRef

From the first sort, we have two groups of records... those in the Item Tables and those not.  I would like to sort by ItemRef next... the formula is the same as for {@ItemRef} above, but call it {@ItemRef1} for reasons that will become clear below.  And
formula = {@ItemRef}


SortFields(2):  OtherData

I first tried to add {QueryMain.OtherData} as a sort field on its own... but I ran into problems with it.  If I want to assign this field to SortFields(0) in my code, then CrystalReports keeps on throwing an error:

System.Runtime.InteropServices.COMException was unhandled
  ErrorCode=-2147213305
  Message="The sorting already exists"
  Source="RptControllers.dll"
  StackTrace:
       at CrystalDecisions.ReportAppServer.Controllers.SortControllerClass.Add(Int32 IndexToAdd, ISCRSort Sort)
       at CrystalDecisions.CrystalReports.Engine.SortField.set_Field(FieldDefinition value)

I'm not sure how to get around this to allow for assigning arbitrary fields... I can't seem to remove one field from the sorting so that I can add it elsewhere in the collection.  Maybe using Reflection... My workaround, however, works by creating rearranged sets of SortFields for each sort order.  To prepare for the workaround, create a simple formula: {@SortOtherData1} =
formula = {QueryMain.OtherData}
SortFields(3-7) .. 

Other fields sort on other fields not listed... PriorityOrder1, IsGreen1, TotalScore1, .. etc.

In my code I load up the report...
daMain.Fill(rds.QueryMain)
Dim rpt As New ManagementReport
rpt.SetDataSource(rds)
rpt.SetParameterValue("Param1", paramString1)
rpt.SetParameterValue("Param2", paramDate2)
CrystalReportViewer1.ReportSource = rpt
I have toggle buttons on the form, to select different sort orders:
Sort 1 - is the default sort order.
Sort 2 - By Item Ref
Sort 3 - By Highest Score
The issue I was having was that when I assigned FormulaFieldDefinition {@ItemRef1} to SortFields(0) I would get the COMException (as above) from CrystalReports.  The answer was to make an exact copy of {@ItemRef1}'s code and save as a new formula {@ItemRef2}.  Repeat this for all SortFields.  Now the reason for not using FieldDefinitions for sorting, and converting them to FormulaFieldDefinitions becomes apparent...

If you have assigned a FieldDefinition to a SortField, you cannot assign it to another SortField without removing it first.  But if you create a FormulaFieldDefinition based on the field, then you can have copies of the FormulaFieldDefinition and assign them without the clash.  You simply have to make sure you replace all SortFields in one go...  Something like the following resorts the loaded report without having to reload the data... works like a charm... :-)

[Note: All the SortFields assigned by default end with the number, 1.  And assuming only 5 fields assigned.]


Sub Sort2ByItemRef()
    rpt = CType(CrystalReportViewer1.ReportSource, ManagementReport)
    Dim thisSort As SortFields = rpt.DataDefinition.SortFields
    thisSort(0).Field = rpt.DataDefinition.FormulaFields.Item("ItemRef2")
    thisSort(1).Field = rpt.DataDefinition.FormulaFields.Item("SortOtherData2")
    thisSort(2).Field = rpt.DataDefinition.FormulaFields.Item("IsNotInItemTable2")
    thisSort(3).Field = rpt.DataDefinition.FormulaFields.Item("SortPriority2")
    thisSort(4).Field = rpt.DataDefinition.FormulaFields.Item("SortHighestScore2")
    CrystalReportViewer1.ReportSource = rpt
End Sub

Sub Sort3ByHighestScore()
    rpt = CType(CrystalReportViewer1.ReportSource, ManagementReport)
    Dim thisSort As SortFields = rpt.DataDefinition.SortFields
    thisSort(0).Field = rpt.DataDefinition.FormulaFields.Item("SortHighestScore3")
    thisSort(1).Field = rpt.DataDefinition.FormulaFields.Item("SortPriority3")
    thisSort(2).Field = rpt.DataDefinition.FormulaFields.Item("IsNotInItemTable3")
    thisSort(3).Field = rpt.DataDefinition.FormulaFields.Item("SortOtherData3")
    thisSort(4).Field = rpt.DataDefinition.FormulaFields.Item("ItemRef3")
    CrystalReportViewer1.ReportSource = rpt
End Sub

I've got it working... Added a new Sub called Sort1Default... to revert to the default sort order (by assigning the sort fields ending in '1' in the default order)...

And those were the simpler reports... now onto the massive Detailed Customer report...  It's taken a few hours to get my head around these SortFields and dynamically changing them... no doubt someone is going to tell me I could have done it a simpler way... Do let me know... As it stands, these reports were being run from Access and that was fraught with difficulties, because first you have to find the most up-to-date MDB... then check it works with the most up-to-date backend MDB... and then check to see if the queries are up-to-date...  That's many hundreds of hours saved in the long term.

Thursday 3 November 2011

old Compaq nx9005 - RIP

I was working on my nx9005 last night, taking some study notes... when there was the sound of a small tick and then a smell of burning electronics. I unplugged it from the life support system (power supply) and removed the battery.

The fan had been overworking for many a year.  I had performed surgery on the laptop several times to unclog the arteries within the fan unit from dust... but this was not enough... the multiple surgeries on the machine led to several further side effects due to aging of joints, such as limb failure (TouchPad) and front panel failure (power/wifi/BT indicator leds).

I pronounced my nx9005 dead at 5:30am 2nd Nov 2011 at the ripe old age of 8 1/2 years.



RIP.

The hard disk, WiFi card, optical reader and power supply were offered for donor parts.  All other parts are scrap.



The nx9005 had a fair old innings as a development machine, media machine, word processing and spreadsheet workhorse, and remote desktop support box... but it spent its last year or two retired to general lap warming and TV guide / general browser duties...


[Edit 14:25 - old habits die hard:


Here's a photo after a few minutes of post-mortem... There's a good HD, Fan Unit, DVD Writer, Floppy, HD Caddy, AMD Processor, RAM PC2100 2x 512MB and a set of screws for the nx9005 (less one that is stuck in the casing)




[Edit 03:33 - I've had a Eureka moment...


A long time ago when I was (probably) cleaning the fan, I remember a small component fell off.. but I could not place it.  The PC worked so I simply ignored it.


As I was taping all the screws into a nice ordered file, ready for salvaging for spare parts... The other grey square component fell off - again I didn't notice where it came from.. But I have just realised I could download a photo of the motherboard from Google, and compare mine with a new one... The circled area is missing a component!  (Now it is missing both - so I could have simply compared it with this picture I took earlier.)  And the smell of burning is strongest around this area... I wonder what these components are for.. Could I attach a couple of new ones and get the old laptop working again?   I would, of course, have to fish the battery, and other bits and pieces out of the bin... nah... I don't think I can be bothered...   


For anyone who wants a service manual, I found this:
        http://www.hp.com/ctg/Manual/c00246219.pdf
It covers nx9005, nx9000, nx9008, nx9010 and the ze series, and a few Presarios...


There's a handy list of POST Codes and Beep codes too... and diagrams of how to take your nx9005 to pieces... 


Having looked around eBay there are plenty of people selling old parts... even faulty motherboards... I think I might have a go at selling it for scrap, to see whether I can make more money that way than I could selling it as a faulty whole... 



Wednesday 28 September 2011

Visual Studio 2008 and the corrupt (ADO-based-only) Dataset Editor

I have been struggling with VS2008 and the DataSet Designer for the last .. nearly 24 hours.  I've got one project with 5x ADO Datasets and one Dataset that is not attached to a database via a connection string.  I've not had this project open for a while... during which the following events have happened:

1) SQL Server 2005 Security Update for SP3 (KB2494113) failed because I had to catch a train - leaving SQL Server 2005 Standard in a corrupt and unrunnable state... I managed to attach the database I was working on to SQL 2008 Express - so I didn't lose much time... I had left SQL 2005 in that state until today, when I successfully installed SQL 2005 SP4, which revived my corrupt SQL Server - and the database is still attached and working... (now to update it with the changes made since then)

2) Several VS2008 Post SP1 Updates - namely: KB2538241, KB2251487, KB971092... It looks like I have installed them about 10 times each... this was down to what I think was a fault with WindowsUpdate in that it kept on offering me the same updates over and over - even after a successful install.... I managed to fix those and some recurring Office 2007 Updates with the automated FixIt from: http://support.microsoft.com/kb/971058

So... now my VS2008 projects open, compile and run perfectly well... all except for trying to open a DataSet in the "DataSet Editor" (right-click the DataSet and choose Open With... and you'll see a list of options).  It's a poor design because, despite the great amount of control you have over add-ins and developing for VS, you aren't shown what actions are taken when you click on any of the items in this list.  My guess is that you have to check the registry...

I select DataSet Editor from the list and I get this helpful error message:

"Load DataSet Error:  Failed to load dataset because of the following error:  Illegal characters in path"

Er... what path?  Click ok and the panel is filled... "To prevent possible data loss before loading the designer, the following errors must be resolved: Illegal characters in path.  Show call stack:"

at System.IO.Path.CheckInvalidPathChars(String path)
at System.IO.Path.IsPathRooted(String path)
at Microsoft.VSDesigner.Data.Local.ConnectionStringConverter.ToRunTime(Project project, ConnectionString csToConvert)
at Microsoft.VSDesigner.VSDesignerPackage.AppSettingsHelper.AddServerExplorerConnections(IServiceProvider serviceProvider, IList connections)
at Microsoft.VSDesigner.VSDesignerPackage.GlobalConnectionService.Microsoft.VSDesigner.VSDesignerPackage.IGlobalConnectionService.GetConnections(IServiceProvider serviceProvider, Project project)
at Microsoft.VSDesigner.DataSource.DesignConnection.GetConnectionFromAppSettings(String objectName, String propertyName, Project currentProject)
at Microsoft.VSDesigner.DataSource.DesignConnection.GetConnectionStringObject(Project targetProject)
at Microsoft.VSDesigner.DataSource.DesignConnection.get_ConnectionStringObject()
at Microsoft.VSDesigner.DataSource.DbSource.set_Connection(IDesignConnection value)
at Microsoft.VSDesigner.DataSource.DesignDataSource.SetConnectionProperty(Source source)
at Microsoft.VSDesigner.DataSource.DesignDataSource.ReadXmlSchema(DataSourceXmlTextReader xmlReader)
at Microsoft.VSDesigner.DataSource.DesignDataSource.ReadXmlSchema(TextReader textReader)
at Microsoft.VSDesigner.DataSource.Designer.DataSourceSerializationService.DeserializeToDataSource(String filePath, Object serializationData)
at Microsoft.VSDesigner.DataSource.ProjectDataSourceDescriptor.LoadDataSource()
at Microsoft.VSDesigner.DataSource.ProjectDataSourceDescriptor.Init(IServiceProvider provider, IVsHierarchy primaryHierarchy, UInt32 primaryItemId, Object primaryDocDataObject, UInt32 docCookie, IVsInvisibleEditor invisibleEditor, IDesignerHost host)
at Microsoft.VSDesigner.DataSource.ProjectDataSourceStorage.EnsureInvisibleEditor(ProjectItem dsProjectItem, Boolean ensureWritable, Boolean createInvisibleEditor, Boolean getUIInfo)
at Microsoft.VSDesigner.DataSource.ProjectDataSourceStorage.GetDataSourceInternal(Object caller, ProjectItem dsProjectItem, Boolean ensureWritable, Boolean createInvisibleEditor, Boolean getUIInfo)
at Microsoft.VSDesigner.DataSource.Designer.DataSourceDesignerLoader.HandleLoad(IDesignerSerializationManager serializationManager)


OK, I get there's a problem... I created a new project, and a new dataset using another database and a different driver (SQL not Jet); the designer loaded and I added a few tables... close the designer... re-open the designer and the same error message appears...  So it's not something in the ConnectionString - because I can add tables to a new DataSet in the Designer... and it's not the Designer itself, because I can still create DataSets...

I have another DataSet that has some hand-built tables... none of the tables in the Dataset are from databases... and this DataSet loads into the Designer as expected... no problems...

A further curious addition to this problem... Double click on DataSet1.xsc (View All Files must be checked to see the auto-generated .XSC file)... I can see the XML, but now another message pops up:

"MS Visual Studio:  Package Load Failure
Package 'Microsoft.VisualStudio.XsdDesigner.Package.DesignerPackage' has failed to load properly ( GUID = {20AAF8FA-14C0-4897-8CA0-4D861E2B1212} ).  Please contact package vendor for assistance.  Application restart is recommended, due to possible environment corruption.  Would you like to disable loading this package in the future?  You may use 'devenv /resetskippkgs' to re-enable package loading."  Yes/No

No, thanks...   But there's a GUID... before closing that message, open Regedit... HKEY_LOCAL_MACHINE\Software\Microsoft\Visual Studio\9.0... look down the list, I see Packages... and the 20AA... key ->

CodeBase = C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies\Microsoft.VisualStudio.XsdDesignerPackage.dll
InProcServer32 = C:\Windows\system32\mscoree.dll
MinEdition = Standard
ProductName = XML Schema Designer
ProductVersion = 1.0

Looks ok to me.. the file exists in that location - version 9.0.30729.1 - 30/07/2008.

I see people suggesting starting devenv with switches such as 'devenv /resetsettings' 'devenv /resetskippkgs', 'devenv /setup' ... I tried them all... I even ran 'devenv /log' to create a log of the problem... but the log doesn't show the right level of detail around the event... there are warnings about loading SpecExplorer, CodeRush and another third-party add-in... Refactor?.. but none of these should interfere with the Dataset Designer...

I have now tried right-clicking on a dataset and choosing "Run Custom Tool"... but the custom tool, MSDataSetGenerator, throws the same 'Illegal characters in path' error and this leaves my project with over 100 errors... most of which are now: "Type 'SurveyDataSet' is not defined", "Type ttt is not defined" - the first error is "Custom Tool Error: Failed to generate code.  Illegal characters in path"...

Because I tried to open the XSC file and it played with the XsdDesigner.. I'm also getting this:

Warning  104  The 'urn:schemas-microsoft-com:xml-msdatasource:DataSetUISetting' element is not declared. C:\Data\Projects\Test1\Datasets\DataSet1.xsc

I can ignore that .. because once I press compile it'll all go away... Nope.. it didn't go away... I'll try restarting VS2008 - hmm still all 'not defined'... Check the recycle bin - and the DataSet1.Designer.vb file is in there!!  Restore, refresh the Solution Explorer and include DataSet1.Designer.vb in project... and all back to normal...  well to whatever 'normal' is defined as today at least...

So.. where is the problem?  It must be something the Designer is loading?  Because the designer loads, and only fails to continue loading if the database links to ADO... It's trying to read the XML Schema.. which then has to get the connection string.. it's getting the connection string from AppSettings... and then failing...

When I load app.config - I receive the above Package Load Failure - for XsdDesigner ... and now the error messages are picking up on schema information:

Message  Could not find schema information for the element 'userSettings'. C:\Data\Projects\Test\app.config

Message  Could not find schema information for the element 'Test.My.MySettings'. C:\Data\Projects\Test\app.config
[..... and so on... for 'setting' 'name' 'serializeAs' 'value' ... until the end of the file...]

One of the updates was a new XmlEditor file... MS11-049 Security Update for VS2008 SP1: June 14, 2011 (KB2251487).  This includes a new Microsoft.XmlEditor.dll ... no other supporting files... or changes to the Designer.. MS11-049 is really about XML throughout the Operating System... it affected SQL Server, VS and MSXML...  If the Editor were at fault, then I would have a problem with ALL datasets?

-------------------  FIXED -------------------
To cut this long story short... I fixed it... I decided to download the last Service Pack.. To re-install VS2008 SP1.  The Microsoft.XmlEditor.dll file above is still there, so it may not have overwritten newer files... but it has sorted out whatever was corrupted in the files, registry or oledb_services... I can now edit DataSets again...

In order to re-install VS2008 SP1... I had to clear out 4GBs off my hard disk.. SP1 requires about 5.6GB free space... and it's left me with 4.3GB free (there's half a gig in a folder called 'Microsoft Visual Studio 2008 SP1' in my temp folder.. I might just copy that to my spare disk - just in case..)

At least this exercise has introduced me to VS Packages and how they work... I've fixed my broken SQL Server 2005 Standard install... and had a few ideas about moving on (upgrading projects to VS2010, moving VS2008 to a VirtualMachine, or having one VirtualMachine with VS2008 or VS2010 per client... and all their individual projects...).  I may only have bought myself a few more weeks... this Lenovo 3000 is nearing its 'Use By' date anyway...

Back to work...

Saturday 11 June 2011

Orphaned Mailboxes and Users after migration from SBS 2003 to SBS 2011

I got a call today from a colleague doing a migration from SBS2003 to SBS2011 ... Exchange 2003 to 2010.

Mailboxes had been copied and he had decommissioned the old server to find that the mailboxes were not there... they were not showing up in the new server... had deleting them removed them from both servers?

At that stage you will have disconnected the connector between the two servers...

First... go to the disk and check the size of the Exchange Store on the disk... The new store should be the same number of GB as the old one...

Second... open the Exchange Management Shell and type

  • Get-MailboxStatistics -Server servername
This should list all the mailboxes on the server.  You need to add a little bit to get more details... 
  • Get-MailboxStatistics -Server servername | select DisplayName, DisconnectReason, LegacyDN, ItemCount, MailboxGuid, Database
Moved mailboxes all had a LegacyDN that contains 'first administrative group' - this is normal and there's no need to spend hours, as I did, thinking it was an error and trying to correct it... Apparently it doesn't matter...

What really mattered was time... wait...  The mailboxes we had transferred over were large - around 2-4GB... they were neither disabled nor anything... the new server simply had not finished processing them.  I came back to the problem about 12 hours later and they were showing up... perhaps it took less time, but they eventually showed up... in the meantime, I had created some problems for myself...

Users were showing up in AD Users and Computers, but not showing up in SBS Console - I tried remedying that using the 'Change User Roles' trick.  But that just created a second mailbox that started collecting emails... and didn't connect to the old mailbox...  Worry about users not showing up in SBS Console AFTER the mailboxes show up.  When all your users are connected to their mailboxes, then run the 'Change User Roles' trick.

If you have created a duplicate mailbox, the ItemCount has increased and you really don't want to lose a single email, try connecting via OWA - you might be able to hook up the 'Archive' function and archive those emails out for later retrieval...  The mailboxes I created only picked up 1 or 2 emails and so I disabled the mailbox and used the Get-DisconnectedMailbox username | Remove-DisconnectedMailbox code from Mike Pfeiffer to get rid of the newly created mailbox.

Soon.. all mailboxes were showing up - they appeared in Disconnected Mailboxes in Exchange Manager.  So I selected the mailbox and tried to connect them to a user... but the user was not showing up... The mailbox name was correct - but Exchange could not see the users in Active Directory.. strange...
  • Get-User
Very simple... it produces a list of users and their RecipientType.  The users that Exchange could not see had a type of 'User' but those Exchange can see have a type 'UserMailbox'....  how to enable them?
  • Get-User | where-object{$_.Name -eq "User's Name"}   - this should list one user (just to check before changing anything...)  if ok, run this:
  • Get-MailboxDatabase -Server servername     - this gets the name of the database to use in this:
  • Get-User | where-object{$_.Name -eq "user's name"} | Enable-Mailbox -Database "database name"
Now run Get-User and check the user has changed RecipientType to UserMailbox.

Now you can go back to Disconnected Mailboxes in Exchange Manager and run Connect to connect them to their mailbox... (if it didn't do it for you when you ran Enable-Mailbox).

Once you have connected all your users up.. go to SBS Console -> Users and Groups -> Users... and run the 'Change User Roles' task - select 'Standard User' in the roles and below select 'Add Role to Users' (NOT 'Replace' - then you are not changing anything).  On the next page you click the checkbox at the bottom 'Show all users from AD' and select all the users (one to start with to test) who you don't see in SBS Console...  run it all through and refresh the view.. all the users should now show up.

One last thing I forgot... and perhaps a trick to get around this issue too... when you run the 'Change User Roles' task each user is given an email address based on the current 'Email Address Policy' ... this is in Exchange Manager.. under Organization Configuration -> Hub Transport -> E-mail Address Policies -> Windows SBS Email Address Policy ... unfortunately 'First Name Only' is NOT an option.. duh!  But we can get around that...
  1. Start -> Run -> adsiedit.msc  <- take care with this... if you get this far you probably are fine...
  2. right-click on AdsiEdit -> Connect To... -> Select a Well-known Naming Context -> Configuration
  3. Select Configuration in the console left panel -> then Configuration -> Services -> Microsoft Exchange ->  servername -> Recipient Policies
  4. In the right-hand panel for Recipient Policies - right-click Windows SBS Email Address Policy select Properties
  5. Click the Filter button and choose to show only attributes that have values
  6. gatewayProxy has a value of SMTP:%m@mydomain.com
  7. Click gatewayProxy and Edit - change to SMTP:%g@mydomain.com - %g is FirstName
  8. click ok and exit out of AdsiEdit.msc
And finally... a warning if you have Blackberry Enterprise Server running... I think I read somewhere there was a knowledgebase article from RIM about what to do in the migration... it was for Exchange 2003 to 2007... but best to take heed of any advice before attempting to migrate...