The LNPA-WG's Slow Horse subcommittee held its regular monthly meeting
Tuesday morning, October 10th, in Banff, Alberta, Canada. Below are draft
minutes for your review; please provide comments and corrections to Steve
Addicks by COB October 20th. Steve's e-mail address is
stephen.addicks@wcom.com.
Participants
AT&T - Beth Watkins, H.L. Gowda
Bell Canada - Chris Martin
Canadian LNP Consortium - Marian Hearn
ELI - Dennis Robins (via conf bridge)
ESI - Jim Rooks, Ron Stutheit
Neustar - Gustavo Hannecke, Marcel Champagne
Global Crossing - Therese Mooney
Qwest - Dave Garner
SBC - Charles Ryburn
Sprint - Stephanie Swanson
Telecom Software - Jean Anthony
Telcordia - John Malyar
Telus - Dan Collier
Verizon - Bob Angevine, Gary Sacra, Kevin Lewis, Sharon Bridges
Williams Communications - Lana Swalls
WorldCom - Steve Addicks
XO Communications - Jamie Sharpe
Performance Testing Approaches
The following questions were developed during today's wide-ranging
discussion of performance certification.
Who is responsible for performance testing?
1. vendor?
2. service provider?
3. both?
When is performance testing done?
1. during interoperability testing (ITP) (vendor certification)?
2. during turn-up testing (TUT)?
3. in production?
How often is performance re-certified?
1. with each new NPAC release?
2. with each new service provider release?
3. with each service provider reconfiguration?
4. to demonstrate a new performance level?
How is performance testing done?
if during interoperability testing (ITP):
1. NPAC simulator
2. third-party, stand-alone simulator
3. PC simulator (to avoid connectivity issues)
if during turn-up testing (TUT):
1. third-party, stand-alone simulator
2. NPAC test mode (new)
if in production:
1. third-party, stand-alone simulator
2. NPAC test mode (new)
Can a service provider inherit a vendor's performance certification?
Can a "superior" configuration inherit performance certification?
Should the question of whether the performance test is intended to benchmark
performance or simply to pass/fail against a specified value be re-addressed?
(At our August meeting, we decided it was not for benchmarking purposes.)
The objective of this discussion is to develop a recommendation for the NAPM
LLC to use in refining future SOW language once a specific performance
requirement has been developed.
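For illustration only, the sketch below (in Python) shows one way a pass/fail
performance test could be driven from any of the simulators listed above.
Everything in it is hypothetical: send_sv_create() stands in for whatever call
the chosen simulator would expose, and the message count and required rate are
placeholders, not agreed requirements.

import time

MESSAGES = 1000               # size of the test run (placeholder)
REQUIRED_MSGS_PER_SEC = 3.0   # specified pass/fail value (placeholder)

def send_sv_create(i):
    # Stand-in for submitting one SOA subscription-version message
    # to the simulator; replace with the real call.
    time.sleep(0.001)

start = time.monotonic()
for i in range(MESSAGES):
    send_sv_create(i)
elapsed = time.monotonic() - start

rate = MESSAGES / elapsed
print("sustained rate: %.2f msgs/sec" % rate)
print("PASS" if rate >= REQUIRED_MSGS_PER_SEC else "FAIL")

Note that the same run yields both a benchmark figure (the measured rate) and
a pass/fail result against the specified value, which bears on the question
raised above.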
SOA/NPAC Traffic Data Request
The Slow Horse subcommittee's request for a monthly SOA/NPAC traffic report
from Neustar, to start once release 3.0 is implemented in all U.S. regions,
was forwarded to the NAPM LLC September 20th. However, because the need to
send the SOW request to Neustar is not urgent -- the NPAC cannot begin
collecting data until about April 2001, when release 3.0 is expected to be in
production throughout the U.S. regions -- the NAPM LLC deferred consideration
of the subcommittee's request until its October 26th meeting.
SOA Performance - Complementary Data
Data on intervals measured from the NPAC's receipt of a SOA message (before
encryption), which would require a time stamp not available in current NPAC
software, was discussed briefly at our September meeting. The discussion will
resume at our December meeting.
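As a minimal sketch of the proposed computation, assuming each record carried
both the pre-encryption receipt time stamp (the field not present in current
NPAC software) and a completion time stamp; the field names and sample values
below are hypothetical:

from datetime import datetime

records = [
    # (t_receipt, t_done) -- illustrative values only
    (datetime(2000, 10, 10, 9, 0, 0, 120000),
     datetime(2000, 10, 10, 9, 0, 1, 870000)),
    (datetime(2000, 10, 10, 9, 0, 2, 40000),
     datetime(2000, 10, 10, 9, 0, 3, 10000)),
]

intervals = [(done - received).total_seconds() for received, done in records]
print("mean interval: %.3f s" % (sum(intervals) / len(intervals)))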
Status Report for NANC
LSMS performance certification testing approach alternatives were developed
for further discussion.
Next Meeting
The next meeting of the Slow Horse subcommittee is scheduled for the morning
of November 7th in Tampa.
Agenda for the meeting:
Review of October minutes
Report on status of SOA/NPAC Traffic Data Request - Addicks
Review performance testing approach questions developed at October meeting
Prepare NANC status report
Develop agenda for December meeting