[thelist] php/mysql - inserting duplicate content

Bob Meetin bobm at dottedi.biz
Fri Apr 10 16:03:43 CDT 2009


Thus far I've tried about a million different things, including changing
how the file is read line by line, but I'm still getting duplicate
database entries.

$result seems to be getting processed twice somewhere, but I can't see
where.  The script is lean; if anyone is up for taking a look I will put
up a text file (a reworked sketch of the loop is also pasted below the
quoted message).  I've written import scripts before without running
into anything like this.  Maybe this is #FollowFriday and I have sinned?
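
In case it helps anyone see what I mean, here is a stripped-down sketch
of the check I've been using to tell whether the loop iterates twice per
line or the whole script is being requested twice.  The log file name
and counter are placeholders, not pieces of the real script:

<?php
// Sketch only: log one line per invocation so a double HTTP request
// (browser refresh, double submit, prefetch, etc.) shows up in the log.
file_put_contents("import_debug.log", date("c") . " script started\n", FILE_APPEND);

$fp   = fopen("shipping.csv", "r");
$rows = 0;

// Checking the fgets() return directly avoids the stray extra pass
// that the while (!feof($fp)) pattern can make at end of file.
while (($line = fgets($fp, 4096)) !== false)
{
  $rows++;
}
fclose($fp);

// If this count matches the .csv line count but the table still gets
// twice as many inserts, the problem isn't the read loop itself.
file_put_contents("import_debug.log", "read $rows lines\n", FILE_APPEND);
?>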

-Bob

Bob Meetin wrote:
> I wrote a script to import content from a .csv file (tab delimited) into 
> the Joomla-Virtuemart shipping_rate table.  It reads the contents of the 
> .csv file, then, line by line, runs the mysql query if the line passes a 
> test.  The problem I have encountered is that the query is running twice 
> and I'm ending up with duplicate rows for every row in the .csv file. 
>
> If I add an echo statement I don't see the duplicate entry echoing 
> back.  If I echo the query and copy/paste it at the mysql prompt it 
> behaves, of course: only a single entry is created.
>
> So, is there something about the while loop that is messed up?
>
> <?php
>
> $file = "shipping.csv";
> $fp = fopen("$file", "r");
>
> while (!feof($fp))
> {
>   $line = fgets($fp, 4096);
>
>   $columns = explode("\t", $line);
>   $c0 = trim($columns[0]);
>   $c1 ...
>   $c2 ...
>
>   if ( $c0 == "yes" )
>   {
>     $query = "some query";
>     $result = mysql_query( $query );
>     echo "Q: $query<br />R: $result<br />";
>   }
> }
>
> ?>
>
>   
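
For what it's worth, here is roughly the direction I'm leaning for the
rework mentioned above.  The table name, column meanings and the INSERT
are placeholders rather than the actual Virtuemart shipping_rate query,
so treat it as a sketch, not the real script:

<?php
// Sketch -- "shipping.csv", the column layout and the INSERT target are
// assumptions, not the actual shipping_rate schema.
$fp       = fopen("shipping.csv", "r");
$inserted = 0;

// Test the fgets() return instead of feof() so the loop body never
// runs on a blank or false final read.
while (($line = fgets($fp, 4096)) !== false)
{
  $line = trim($line);
  if ($line == "")
  {
    continue;                          // skip blank lines
  }

  $columns = explode("\t", $line);     // tab-delimited
  $c0 = trim($columns[0]);
  $c1 = trim($columns[1]);
  $c2 = trim($columns[2]);

  if ($c0 == "yes")
  {
    // Placeholder query; escape the values before they hit mysql.
    $query = sprintf("INSERT INTO example_shipping_rate (rate_name, rate_value) VALUES ('%s', '%s')",
                     mysql_real_escape_string($c1),
                     mysql_real_escape_string($c2));
    $result = mysql_query($query);
    if ($result)
    {
      $inserted++;
    }
    echo "Q: $query<br />R: $result<br />";
  }
}
fclose($fp);

echo "Inserted $inserted rows<br />";
?>

If nothing else turns up, a UNIQUE index on whatever makes a shipping
rate unique (with INSERT IGNORE) would at least keep the table clean
while I chase down the real cause.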

-- 
Bob Meetin
www.dottedi.biz
303-926-0167

Find me on www.Twitter.com/bobmeetin, Facebook, and www.linkedin.com/in/bobmeetin, or catch my blog at www.dottedi.biz/blog.php

Standards - you gotta love em - with so many to choose from!



