The specified cast from a materialized 'System.Int64' type to the 'System.Int32' type is not valid

I ran into a SQL-related .NET error today.  If you read the title of the post you've probably guessed what it is.  If not, here's the error:

The specified cast from a materialized 'System.Int64' type to the 'System.Int32' type is not valid.

Google was marginally helpful, but if you're like me, when you google an error you're hoping for that post that says: if you see ABC, then you have done XYZ; do 123 to fix the error.  In this case the error is saying there's a datatype problem, something about a Long and an Int.  My app only has ints; no longs in the schema at all.  This specific error came out of an Entity Framework method, so I couldn't easily pinpoint it to a given column.  98% of my app is pure Entity Framework, mostly code-first (though I do write out transactional schema patches to update the database in a scripted manner).  There is one stored procedure in the app, and it was this sproc that I had just changed to add some new features, specifically paging.

In this case, the sproc looked like this:

Reports_MyReport SomeGuidID

It returned the data of the report: Field1, Field2, Field3, etc.

My change was to add paging directly to the sproc, to reduce the amount of data leaving the box, as this report was going to get hit a lot.

Reports_MyReport SomeGuid, PageNumber, PageSize

It returns data like RowNumber, Field1, Field2, Field3, TotalRows.

I tested out the changes and they worked great: no nulls where they weren't expected.

Upon running the new sproc through my app, I got the error listed above.  It turned out that my sproc, which had code like this:

select RowNumber, Field1, Field2, Field3, @totalRows as TotalRows  ….

was the culprit.  @totalRows was being interpreted as an Int64, as it was coming from @@ROWCOUNT.  I know I'll never have more than Int32.MaxValue rows in that table, so for me casting to int solved the problem:

select RowNumber, Field1, Field2, Field3, cast(@totalRows as int) as TotalRows  ….
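For context, here's roughly the shape the end of the sproc took.  This is a sketch, not my actual code: the table, the guid column, and the paging math are hypothetical (the post elides the real query, and @totalRows is populated earlier from the rowcount of the unpaged query).  The key line is the explicit cast:

```sql
-- Hypothetical paged report query; @PageNumber and @PageSize are the
-- new parameters, dbo.SomeReportTable / ReportID are made-up names.
;with Report as (
    select row_number() over (order by Field1) as RowNumber,
           Field1, Field2, Field3
    from   dbo.SomeReportTable
    where  ReportID = @SomeGuidID
)
select RowNumber, Field1, Field2, Field3,
       cast(@totalRows as int) as TotalRows  -- without the cast, EF materializes this as Int64
from   Report
where  RowNumber between (@PageNumber - 1) * @PageSize + 1
                     and @PageNumber * @PageSize;
```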

Problem solved, error gone!

Hopefully, by the time I have completely forgotten about this and make the exact same mistake again – in six months – this post will be living in the googles.  Hopefully it helps someone else as well.

 

Scott

 


Maximum call stack size exceeded when using Modals and AngularJS

Saw a really weird error today; as you can guess from the title of the post, it has to do with Maximum call stack size exceeded, modals, and AngularJS.   First, a bit about the app.  I'm currently building a table of data; it's basically a status report, so the url might be http://mysite.com/#/report/myreport which would show you the list of data.  However, I'm also building a detail page that will use a modal popup when you double click on a given row of the report.   I wanted this to be addressable so it would have its own url.  So basically I've got my AngularJS route set up with an optional rowid parameter.

Request this:  http://mysite.com/#/report/myreport and you see the report show up.

Request this: http://mysite.com/#/report/myreport/32 and you see the report in the background with a modal overlay showing the details of row 32.

You can close the modal and view other records, and the url changes as you do so.  All in all, it's a pretty simple AngularJS app.  No new concepts for me; I've used modals, controllers, all of it before.  However, today when I wired up my code to load the modal when the page first loads, I saw an evil error in Chrome:

Uncaught RangeError: Maximum call stack size exceeded.  jQuery.event.special.focus.trigger…. tons of jquery calls ….

Only in Chrome; Firefox acted a bit weird, and IE, well, IE I'll save for another day.  I tried googling it, and all the results pointed to a recursive loop, but I had no recursive code.  Google was actually pretty unhelpful for this error.  I backed out all my changes for the instant modal popup functionality, and instead just wrote a log entry to the console.

The error didn't appear, but the log entry did appear, twice.  Somehow my controller was loading and running twice.  Googling that was helpful; I found this:  http://stackoverflow.com/questions/15535336/combating-angularjs-executing-controller-twice

It turned out I was doing the exact same thing.  I had no idea you were not supposed to register the controller with an ng-controller attribute at the top of the page if you were also naming it in the routing configuration.  Removing that single HTML attribute solved the dual log entry problem.  I then re-enabled my instant modal on load code, and that all worked without error as well.  Somehow the controller loading twice, combined with my code running at startup, caused some weird stack overflow.
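For the sake of my future self, a sketch of the two halves of the mistake.  The names here are illustrative, not my actual code (and the `:rowid?` optional-parameter syntax assumes ngRoute from AngularJS 1.2 or later); the point is that when the route names the controller, the markup must not also carry an ng-controller attribute:

```javascript
// Before: the controller was registered twice.
// In the markup:  <div ng-controller="MyReportController">  <-- remove this
// And again in the route configuration below.

// After: only the route names the controller.
myApp.config(["$routeProvider", function ($routeProvider) {
    $routeProvider.when("/report/:reportName/:rowid?", {
        templateUrl: "report.html",
        controller: "MyReportController" // registered here, and only here
    });
}]);
```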

Once it was loading only once, the problem went away.  So here's another post for google; I'm sure I'll make this mistake again sometime, probably in about three months or so.  Hopefully then I'll google it and find my own blog post.


Bundling AngularJS

There are lots of posts out there on how to correctly bundle AngularJS with any of the popular bundling tools.  I happen to use MVC's built-in bundling.  It works for me for now, though I am looking at using LESS and Grunt for some additional functionality.  Here's a recent problem that I solved.

I would see an error that looked something like this: 

Module Error
error in component $injector
Failed to instantiate module myApp due to:
Error: [$injector:unpr] http://errors.angularjs.org/undefined/$injector/unpr?p0=n
at Error ()
at http://myapp.mysite.com/bundles/app?v=xmkhVlgjOx7Eo5ltsK1SZpAavJM1dB6-bg-ujblqCgc1:1:130753
at http://myapp.mysite.com/bundles/app?v=xmkhVlgjOx7Eo5ltsK1SZpAavJM1dB6-bg-ujblqCgc1:1:143147
at i (http://myapp.mysite.com/bundles/app?v=xmkhVlgjOx7Eo5ltsK1SZpAavJM1dB6-bg-ujblqCgc1:1:141792)
at Object.r [as invoke] (http://myapp.mysite.com/bundles/app?v=xmkhVlgjOx7Eo5ltsK1SZpAavJM1dB6-bg-ujblqCgc1:1:141964)
at http://myapp.mysite.com/bundles/app?v=xmkhVlgjOx7Eo5ltsK1SZpAavJM1dB6-bg-ujblqCgc1:1:141398
at Array.forEach (native)
at r (http://myapp.mysite.com/bundles/app?v=xmkhVlgjOx7Eo5ltsK1SZpAavJM1dB6-bg-ujblqCgc1:1:131065)
at p (http://myapp.mysite.com/bundles/app?v=xmkhVlgjOx7Eo5ltsK1SZpAavJM1dB6-bg-ujblqCgc1:1:141201)
at sf (http://myapp.mysite.com/bundles/app?v=xmkhVlgjOx7Eo5ltsK1SZpAavJM1dB6-bg-ujblqCgc1:1:143261

I was pretty confused as to what was going on; I'd properly configured all of my controllers so they used an array like ['$http', '$q', ...] for injection.  What was going on?

It turned out to be one simple error.  In my app.js, which handles the routing, I had code looking like this:

myApp.config(function ($routeProvider) {
    $routeProvider.

That turned out to be the issue.  Once I refactored it to look like this:

myApp.config(["$routeProvider", function ($routeProvider) {
    $routeProvider.

It then bundled nicely with no JavaScript errors.  The array syntax matters because minification renames function parameters; the strings in the array survive minification, so the injector can still resolve the dependencies by name.  I should have noticed immediately when it said a module error, but I didn't.  Hopefully this post gets into google and helps someone else.

 


Cleaning Up with MSpec

I use MSpec for testing my code.  I love the behavior-driven approach; for me it just makes sense.  I love how, if my boss asks where we are with component XYZ, I can just run all my tests and give him the output.  It shows what's working and what's not.  Furthermore, we can make a rule that the software doesn't have a feature until there's an MSpec test saying that it does.

I was recently working with MSpec doing integration tests – these I usually do to make sure my DAL and my database are structurally compatible – and I kept getting database constraint errors when I reran tests.  It didn't make a lot of sense, as I had a cleanup section in my code and I wasn't seeing any errors.

It turns out that if an exception is thrown in the cleanup section, you'll never hear about it.  At least for me, it doesn't bubble up.  Once I put a breakpoint on the first line of the cleanup, I figured it out.  Previously I thought it wasn't even hitting my cleanup code.  It was hitting the cleanup section; there was just an error in that section.  Hopefully this gets into the googles and helps someone.

using Machine.Specifications;
using System;

namespace My.Project.Integration.Tests
{
    public class when_creating_a_mything_record
    {
        protected static IMyThingService MyThingService { get; set; }

        protected static MyThing MyThing { get; set; }
        protected static MyThing SavedMyThing { get; set; }
        protected static Exception Exception { get; set; }

        Establish context = () =>
        {
            MyThing = new MyThing()
            {
                Name = "thing",
                Description = "thing one"
            };
            MyThingService = ServiceLocator.Instance.Locate<IMyThingService>();
        };

        Because of = () => Exception = Catch.Exception(() =>
        {
            SavedMyThing = MyThingService.Insert(MyThing);
        });

        It should_not_have_thrown_an_exception = () => Exception.ShouldBeNull();
        It should_have_an_id_that_does_not_match_guid_empty = () => SavedMyThing.ID.ShouldNotEqual(Guid.Empty);

        Cleanup after = () =>
        {
            // If this does not appear to get called, put a breakpoint here.  You may have an exception.
            // Guard against a failed insert: Delete(null) would itself throw, silently, in cleanup.
            if (SavedMyThing != null)
            {
                MyThingService.Delete(SavedMyThing);
            }
        };
    }
}


Git Helpful Hint – Just Trust the Remote Branch

At work I'm often working with others.  At times, though, we'll each be working in our own projects (read: git repositories) and then working together in common shared projects.  The repository that I'm the main developer on and all the shared ones are always up to date, but the repositories that I don't often commit to can find themselves lagging quite far behind.  Recently I saw a large number of merge failures when trying to get the latest version of one of these repositories.

Essentially what I wanted was to say: hey Git, I don't work on this repository often, so trust everything that is coming from upstream and overwrite my stuff.  There were many things that the googles suggested I could do, but quite a few wouldn't work in the midst of a merge failure that had already occurred.

This Stack Overflow post did work, though.

Here’s how you do it:

git fetch --all

git reset --hard origin/master

Be warned: this essentially tosses out all your local changes, so make sure your situation is like mine, or similar, before doing it.  I figured if I blogged about this, it would help me remember the next time it came up.


Fearless

Yesterday, amidst a sea of tears and in front of a hot grill, I finished reading Fearless – http://fearlessnavyseal.com   I was delaying my grilling responsibilities for the evening to cram in the last few pages.  I love reading, and recently I've been on a biography kick; I read Howard Wasdin's biography a year or so ago, and Chris Kyle's just this weekend.  All of these books have been biographies of Navy SEALs, and when on Friday I saw a book titled "Fearless," the story of a SEAL Team Six member, advertised on Amazon, I gave it a look and ended up buying it.

This book I knew would be different; the others were written by the SEALs themselves, sometimes with co-authors.  This book, Adam Brown's story, was written after he was killed in action.  I knew it would be sad, I knew it would be touching.  Even though I was armed with that knowledge, I wasn't prepared for this book.  I'll challenge anyone to read the first two pages and put the book down from there.  I couldn't, and I don't think you'll be able to either.

Put quite simply, I've never read anything that had that big an impact on me before.  I found myself in awe of and inspired by Adam Brown in many parts of the book, and in full tears in many others.  The story is so unbelievable that it couldn't be fiction.  The old adage that truth is stranger than fiction is very true for this story.  I won't give away details, and please, if you have read this book, do not post spoilers in the comments.  I would be lying if I said that after reading this book I wouldn't look at life at least a little differently.

It is a book about the military; it is a book about a warrior.  Most of all, though, it's a book about humanity, and the struggles and obstacles that Adam had to overcome.  Definitely read this book.


Last Day at Telligent

Today was my last day at Telligent.

My first day was February 4th, 2005.  By my math – or by datediff() in SQL – that's 2,646 days.  I started there as a contract employee wondering how long it would last.  I remember leaving a job that I had been in for about three years when I accepted the position at Telligent.  I remember how scared I was of telecommuting.  No office, no managers to watch over me.  What's to stop total anarchy and no work?  Well, absolutely amazing people, incredible clients, and the most challenging work of my career.  That's what.

When asked by friends how I could work from home for a company so far away and not just play games all day – they didn't understand that the work was the fun, and it still is to this very day.  I worked on amazing projects right up until my final hours at Telligent.  I'll always be grateful to Rob Howard and Jason Alexander for giving me the opportunity in those early days.  What started as a small group of .net geeks who sometimes did meetings over Xbox Live & Madden football morphed into something amazing: an incredible company that I still absolutely love and know will continue to do amazing things.  I'm waiting for the day that I hear Telligent's name in accolades on CNBC; it will come, and I will cheer on that day.

To everyone at Telligent: thank you so much for the amazing seven years.  I think I got to do almost everything a software developer could do at Telligent.  It wasn't always easy, and was sometimes very hard, but I'd not trade any of it for the world.

I look back at the time with incredible fondness and take away many memories.  I look forward to the very near future when I start my new job at Orcsweb.  I'm leaving one amazing place to go to another amazing place.  I could not be happier, and couldn't think of a better place to move on to.  Expect to hear quite a bit more from me about some of the very cool things I'll be doing at Orcsweb.  I know I've been somewhat silent in the past as I worked on other non-technical things, writing, etc., but there definitely will be more posts about amazing technologies and what I'm discovering along the way.

Thank You Everyone.

Scott


Major News Coming Soon

Stay tuned, and check back soon.  Perhaps Friday, at say 5-ish Eastern time.  There's big news coming to this blog.  No hints, and I'm not telling till Friday :)


Database Schema Changes with Postgresql

I've been playing with Linux recently and really enjoying it.  Quite quickly, however, I began looking for a good database to use on Linux.  The obvious choice, if you look around the internets, is MySQL.  It's the easy one to use; most people use it, and as such there's a ton of documentation.  Well, for those that know me, the easiest and simplest way is not always what I choose, so I began to look at PostgreSQL.  PostgreSQL has more in common with databases like Oracle or DB2 than with MySQL; for some this is a minus, but for me that's a definite plus.  I have a lot of Oracle experience and skills that can be applied to PostgreSQL quite nicely.

One thing I absolutely love in Postgres and Oracle is the create or replace functionality for functions.  Gone are the days of if object_id(my_function) is not null…  Now I can just create or replace my_function.  That's killer in my opinion.

So now I've got my database (PostgreSQL 9.0) and it's time to create some tables.  This was a bit of a dilemma for me.  In my native environment (Oracle or SQL Server) I create database change scripts that safely and reentrantly (you can run them repeatedly without error) create the objects that are needed, in a transactional manner.  But how to do this in PostgreSQL?  Well, this post is about what I've figured out, and how I'm currently solving that problem.

First, take a look at my PostgreSQL GitHub repository.  What I wanted was a way to have controlled, scripted database changes that were transactional.  They would either entirely succeed, or entirely fail and leave the database unmodified.  Also, the individual script files would have to have a way of saying: I am change #X, and I need change #Y to be present first.

Below is how I've accomplished this.  The script applies database change 1.0.0 to a database that is at change 0.0.0 (see dbc_0.0.0 in the GitHub repo for how schemaversion and the initial record get put in place).  Hopefully this is of some help to someone.

create or replace function dbc_1_0_0() returns void as
$$
declare
    _old_major integer := 0;
    _old_minor integer := 0;
    _old_revision integer := 0;
    _major integer := 1;
    _minor integer := 0;
    _revision integer := 0;
    _schemaname varchar := 'my-application-name';
begin
    if exists(select 1 from schemaversion
              where major = _old_major and minor = _old_minor and revision = _old_revision
                and schemaname = _schemaname and current_version = true) then
        create sequence user_id_seq;
        create table users
        (
            userid int8 default nextval('user_id_seq') not null,
            email varchar(64) not null,
            password varchar(128) not null,
            fullname varchar(64) not null,
            created_date timestamptz not null,
            modified_date timestamptz not null,
            constraint pk_users_userid primary key (userid)
        );
        update schemaversion set current_version = false
        where major = _old_major and minor = _old_minor and revision = _old_revision
          and schemaname = _schemaname;
        insert into schemaversion
            (major, minor, revision, schemaname, installed_date, current_version)
        values
            (_major, _minor, _revision, _schemaname, current_timestamp, true);
    else
        -- A bare select has no destination in plpgsql, so raise instead;
        -- the missing prerequisite is the *old* version, not the new one.
        raise exception 'Missing prerequisite schema update % version %.%.%',
            _schemaname, _old_major, _old_minor, _old_revision;
    end if;
exception
    when others then
        raise exception 'caught exception - (%) - when applying update %.%.% to %',
            SQLERRM, _major, _minor, _revision, _schemaname;
end
$$
language plpgsql;

select dbc_1_0_0();
drop function dbc_1_0_0();

 


Massive Data Access Library

Today I got introduced to Massive.  Massive is a data access library written by @robconery.  My co-workers @jaymed and @mgroves introduced me to it, and I'll admit I was somewhat reluctant to give it a shot.  I'm normally of the opinion that the simplest DAL is just plain SqlCommands, SqlConnections, and a stored procedure call.  However, after using Massive for a few hours, I'm really excited about this new tool in my tool chest.  Of course Massive has been out for a while; I'm quite slow to the party sometimes.

Massive is great, though; I love how in a single line of code I can have data back.  Alternatively, I can update the table or run any other query that I want.  This library, if it did nothing else, makes integration testing a breeze.  Usually for my tests I prefer to stub out a DAL repository via the interface/repository pattern.  The tests get a FakeSomethingRepository that uses a List<TheObject> as its in-memory store.  What I love about Massive is that my unit tests, which are really simple because they can test DataRepository.SomeList, can quickly become integration tests with almost no additional complexity.  For me, that's huge.  I'm trying a new thing, which is not to check in any code that doesn't have full unit test coverage, and Massive is making that happen in a Massive way.  (Couldn't resist.)

I'm still learning about these dynamic things, but a bit of learning never hurt anyone, and I know @mgroves will help me out when I get confused.

@robconery If we ever meet, I owe you a beer! Massive is awesome!

If you’re not familiar with Massive, I urge you to check it out here.
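To give a flavor of the one-line style, here's a sketch from memory, not code from my app; the "Northwind" connection string name, the Products table, and the column names are all hypothetical:

```csharp
using Massive;

// A table class in Massive is just a DynamicModel subclass pointing at a
// connection string name and a table.  All names here are placeholders.
public class Products : DynamicModel
{
    public Products() : base("Northwind", tableName: "Products", primaryKeyField: "ProductID") { }
}

public class Demo
{
    public static void Main()
    {
        var tbl = new Products();

        // One line to get data back (dynamic rows):
        var expensive = tbl.All(where: "UnitPrice > @0", args: 20);

        // One line to update a row by primary key:
        tbl.Update(new { UnitPrice = 25 }, 42);
    }
}
```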

